
The Archive Base


The Archive Base Latest Questions

Editorial Team
Asked: May 16, 2026 at 02:09 (UTC)

Let me preface this by saying that I’m pretty new to Java.

I have a file that contains a single line. The size of the file is about 200MB. I need to insert a newline character after every 309th character. I believe I have the code to do this properly, but I keep running into memory errors. I’ve tried increasing the heap space to no avail.

Is there a less memory-intensive way of handling this?

BufferedReader r = new BufferedReader(new FileReader(fileName));

String line;

while ((line=r.readLine()) != null) {
  System.out.println(line.replaceAll("(.{309})", "$1\n"));
}
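As an aside on what that replaceAll call is doing: the pattern captures each full run of 309 characters and re-emits it with a trailing newline, leaving any shorter trailing run untouched. A tiny standalone sketch, with a chunk size of 5 instead of 309 for readability:

```java
class RegexDemo {
  public static void main(String[] args) {
    // Each full 5-character run is captured by (.{5}) and re-emitted
    // as "$1\n"; the 2-character tail "kl" does not match and stays as-is.
    String result = "abcdefghijkl".replaceAll("(.{5})", "$1\n");
    System.out.println(result); // abcde / fghij / kl on separate lines
  }
}
```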


1 Answer

  1. Editorial Team
     Answered on May 16, 2026 at 2:09 am

    Your code has two problems:

    1. You’re loading the entire file into memory at once: since the file is a single line, readLine() reads all 200MB of it, so you’ll need at least 200MB of heap space for that alone; and

    2. Using a regex like that is a horribly inefficient way of adding newlines; the straightforward code solution will be an order of magnitude faster.

    Both of these problems are easily fixed.

    Use a FileReader and FileWriter to load 309 characters at a time, append a newline and write those out.

    Update: I added a test of both character-by-character and buffered reading. The buffered version actually adds a lot of complexity, because you need to cater for the possible (but in practice exceedingly rare) situation where read() returns fewer characters than you asked for while there are still characters left to read.

    Firstly the simple version:

    private static void charRead(boolean verifyHash) {
      Reader in = null;
      Writer out = null;
      long start = System.nanoTime();
      long wrote = 0;
      MessageDigest md = null;
      try {
        if (verifyHash) {
          md = MessageDigest.getInstance("SHA1");
        }
        in = new BufferedReader(new FileReader(IN_FILE));
        out = new BufferedWriter(new FileWriter(CHAR_FILE));
        int count = 0;
        for (int c = in.read(); c != -1; c = in.read()) {
          if (verifyHash) {
            md.update((byte) c);
          }
          out.write(c);
          wrote++;
          if (++count >= COUNT) {
            if (verifyHash) {
              md.update((byte) '\n');
            }
            out.write("\n");
            wrote++;
            count = 0;
          }
        }
      } catch (IOException e) {
        throw new RuntimeException(e);
      } catch (NoSuchAlgorithmException e) {
        throw new RuntimeException(e);
      } finally {
        safeClose(in);
        safeClose(out);
        long end = System.nanoTime();
        System.out.printf("Created %s size %,d in %,.3f seconds. Hash: %s%n",
            CHAR_FILE, wrote, (end - start) / 1000000000.0d, hash(md, verifyHash));
      }
    }
    

    And the “block” version:

    private static void blockRead(boolean verifyHash) {
      Reader in = null;
      Writer out = null;
      long start = System.nanoTime();
      long wrote = 0;
      MessageDigest md = null;
      try {
        if (verifyHash) {
          md = MessageDigest.getInstance("SHA1");
        }
        in = new BufferedReader(new FileReader(IN_FILE));
        out = new BufferedWriter(new FileWriter(BLOCK_FILE));
        char[] buf = new char[COUNT + 1]; // leave a space for the newline
        int lastRead = in.read(buf, 0, COUNT); // read in 309 chars at a time
        while (lastRead != -1) { // end of file
          // technically less than 309 characters may have been read
          // this is very unusual but possible so we need to keep
          // reading until we get all the characters we want
          int totalRead = lastRead;
          while (totalRead < COUNT) {
            lastRead = in.read(buf, totalRead, COUNT - totalRead);
            if (lastRead == -1) {
              break;
            } else {
          totalRead += lastRead;
            }
          }
    
      // if we get -1, it'll eventually signal an exit, but first
      // we must write any characters we have read.
      // note: as written, a trailing chunk shorter than 309 characters
      // does NOT get a newline appended; adjust here if you want one
          if (totalRead == COUNT) {
            buf[totalRead++] = '\n';
          }
          if (totalRead > 0) {
            out.write(buf, 0, totalRead);
            if (verifyHash) {
              md.update(new String(buf, 0, totalRead).getBytes("UTF-8"));
            }
            wrote += totalRead;
          }
    
          // don't try and read again if we've already hit EOF
          if (lastRead != -1) {
        lastRead = in.read(buf, 0, COUNT);
          }
        }
      } catch (IOException e) {
        throw new RuntimeException(e);
      } catch (NoSuchAlgorithmException e) {
        throw new RuntimeException(e);
      } finally {
        safeClose(in);
        safeClose(out);
        long end = System.nanoTime();
        System.out.printf("Created %s size %,d in %,.3f seconds. Hash: %s%n",
            BLOCK_FILE, wrote, (end - start) / 1000000000.0d, hash(md, verifyHash));
      }
    }
    

    And a method to create a test file:

    private static void createFile() {
      Writer out = null;
      long start = System.nanoTime();
      try {
        out = new BufferedWriter(new FileWriter(IN_FILE));
        Random r = new Random();
        for (int i = 0; i < SIZE; i++) {
          out.write(CHARS[r.nextInt(CHARS.length)]);
        }
      } catch (IOException e) {
        throw new RuntimeException(e);
      } finally {
        safeClose(out);
        long end = System.nanoTime();
        System.out.printf("Created %s size %,d in %,.3f seconds%n",
          IN_FILE, SIZE, (end - start) / 1000000000.0d);
      }
    }
    

    These all assume:

    private static final int SIZE = 200000000;
    private static final int COUNT = 309;
    private static final char[] CHARS;
    private static final char[] BYTES = new char[]{'0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'a', 'b', 'c', 'd', 'e', 'f'};
    private static final String IN_FILE = "E:\\temp\\in.dat";
    private static final String CHAR_FILE = "E:\\temp\\char.dat";
    private static final String BLOCK_FILE = "E:\\temp\\block.dat";
    
    static {
      char[] chars = new char[1000];
      int nchars = 0;
      for (char c = 'a'; c <= 'z'; c++) {
        chars[nchars++] = c;
        chars[nchars++] = Character.toUpperCase(c);
      }
      for (char c = '0'; c <= '9'; c++) {
        chars[nchars++] = c;
      }
      chars[nchars++] = ' ';
      CHARS = new char[nchars];
      System.arraycopy(chars, 0, CHARS, 0, nchars);
    }
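The safeClose and hash helpers referenced above aren't shown in the answer; here is a minimal sketch of what they might look like. These exact implementations are my assumption, not the author's code, though the otherwise-unused BYTES array of hex digits is plausibly the lookup table for exactly this kind of hash formatting:

```java
import java.security.MessageDigest;

class Helpers {
  // Hex-digit table, mirroring the BYTES array defined in the answer
  private static final char[] BYTES = {'0', '1', '2', '3', '4', '5', '6', '7',
                                       '8', '9', 'a', 'b', 'c', 'd', 'e', 'f'};

  // Close quietly, tolerating null and swallowing IOException
  static void safeClose(java.io.Closeable c) {
    if (c != null) {
      try { c.close(); } catch (java.io.IOException ignored) { }
    }
  }

  // Render the digest as "0x" followed by lowercase hex, matching the
  // benchmark output, or the placeholder used when hashing was disabled
  static String hash(MessageDigest md, boolean verifyHash) {
    if (!verifyHash || md == null) {
      return "(not calculated)";
    }
    StringBuilder sb = new StringBuilder("0x");
    for (byte b : md.digest()) {
      sb.append(BYTES[(b >> 4) & 0xf]).append(BYTES[b & 0xf]);
    }
    return sb.toString();
  }

  public static void main(String[] args) throws Exception {
    MessageDigest md = MessageDigest.getInstance("SHA1");
    md.update("abc".getBytes(java.nio.charset.StandardCharsets.UTF_8));
    System.out.println(hash(md, true)); // SHA1("abc") in the 0x... format
  }
}
```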
    

    Running this test:

    public static void main(String[] args) {
      if (!new File(IN_FILE).exists()) {
        createFile();
      }
      charRead(true);
      charRead(true);
      charRead(false);
      charRead(false);
      blockRead(true);
      blockRead(true);
      blockRead(false);
      blockRead(false);
    }
    

    Gives this result (Intel Q9450, Windows 7 64-bit, 8GB RAM, test run on a 7200rpm 1.5TB drive):

    Created E:\temp\char.dat size 200,647,249 in 29.690 seconds. Hash: 0x22ce9e17e17a67e5ea6f8fe929d2ce4780e8ffa4
    Created E:\temp\char.dat size 200,647,249 in 18.177 seconds. Hash: 0x22ce9e17e17a67e5ea6f8fe929d2ce4780e8ffa4
    Created E:\temp\char.dat size 200,647,249 in 7.911 seconds. Hash: (not calculated)
    Created E:\temp\char.dat size 200,647,249 in 7.867 seconds. Hash: (not calculated)
    Created E:\temp\char.dat size 200,647,249 in 8.018 seconds. Hash: 0x22ce9e17e17a67e5ea6f8fe929d2ce4780e8ffa4
    Created E:\temp\char.dat size 200,647,249 in 7.949 seconds. Hash: 0x22ce9e17e17a67e5ea6f8fe929d2ce4780e8ffa4
    Created E:\temp\char.dat size 200,647,249 in 3.958 seconds. Hash: (not calculated)
    Created E:\temp\char.dat size 200,647,249 in 3.909 seconds. Hash: (not calculated)
    

    Conclusion: the SHA1 hash verification is really expensive, which is why I ran versions with and without it. Basically, after warm-up the “efficient” block version is only about 2x as fast as the character-by-character one. I guess by that point the file is effectively in memory (cached by the OS).

    If I reverse the order of the block and char reads, the result is:

    Created E:\temp\char.dat size 200,647,249 in 8.071 seconds. Hash: 0x22ce9e17e17a67e5ea6f8fe929d2ce4780e8ffa4
    Created E:\temp\char.dat size 200,647,249 in 8.087 seconds. Hash: 0x22ce9e17e17a67e5ea6f8fe929d2ce4780e8ffa4
    Created E:\temp\char.dat size 200,647,249 in 4.128 seconds. Hash: (not calculated)
    Created E:\temp\char.dat size 200,647,249 in 3.918 seconds. Hash: (not calculated)
    Created E:\temp\char.dat size 200,647,249 in 18.020 seconds. Hash: 0x22ce9e17e17a67e5ea6f8fe929d2ce4780e8ffa4
    Created E:\temp\char.dat size 200,647,249 in 17.953 seconds. Hash: 0x22ce9e17e17a67e5ea6f8fe929d2ce4780e8ffa4
    Created E:\temp\char.dat size 200,647,249 in 7.879 seconds. Hash: (not calculated)
    Created E:\temp\char.dat size 200,647,249 in 8.016 seconds. Hash: (not calculated)
    

    It’s interesting that the character-by-character version takes a far bigger initial hit on the first read of the file.

    So, as per usual, it’s a choice between efficiency and simplicity.
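As a footnote, on Java 7 or later the simple character-by-character approach can be written much more compactly with try-with-resources. This is my own sketch (the Wrap/wrap names and file names are hypothetical), with the chunking logic factored into a small method so it can be exercised on in-memory readers:

```java
import java.io.*;

class Wrap {
  // Copy characters from in to out, inserting '\n' after every
  // chunkSize-th character (no newline after a short trailing chunk).
  static void wrap(Reader in, Writer out, int chunkSize) throws IOException {
    int count = 0;
    for (int c = in.read(); c != -1; c = in.read()) {
      out.write(c);
      if (++count == chunkSize) {
        out.write('\n');
        count = 0;
      }
    }
  }

  public static void main(String[] args) throws IOException {
    // For the real 200MB file you would wrap buffered file streams:
    // try (Reader in = new BufferedReader(new FileReader("in.dat"));
    //      Writer out = new BufferedWriter(new FileWriter("out.dat"))) {
    //   wrap(in, out, 309);
    // }
    StringWriter out = new StringWriter();
    wrap(new StringReader("abcdefghijkl"), out, 5);
    System.out.println(out); // abcde / fghij / kl on separate lines
  }
}
```

try-with-resources also closes the streams on exceptions, which removes the need for the safeClose-in-finally pattern used above.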


© 2021 The Archive Base. All Rights Reserved