I am reading in a text file using a FileInputStream that puts the file contents into a byte array. I then convert the byte array into a String using new String(bytes).
Once I have the String, I use String.split("\n") to break the file into a String array, then take each line and parse it with String.split(",") and hold the contents in an ArrayList.
I have a 200MB+ file, and the program runs out of memory even when I start the JVM with 1GB of memory. I know I must be doing something incorrectly somewhere; I'm just not sure whether it's the way I'm parsing or the data structure I'm using.
It is also taking about 12 seconds to parse the file, which seems like a lot of time. Can anyone point out what may be causing me to run out of memory and what may be making my program slow?
The contents of the file look as shown below:
"12334", "100", "1.233", "TEST", "TEXT", "1234"
"12334", "100", "1.233", "TEST", "TEXT", "1234"
.
.
.
"12334", "100", "1.233", "TEST", "TEXT", "1234"
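For reference, the approach described above looks roughly like this (a reconstruction from the description, not the actual code; class and variable names are mine):

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class WholeFileParser {
    // Reads the entire file into one byte array, converts it to one big
    // String, then splits twice. The byte[], the full-file String, the
    // array of lines, and the per-line field arrays are all live on the
    // heap at the same time, which multiplies the file's footprint.
    static List<String[]> parse(String path) throws IOException {
        FileInputStream in = new FileInputStream(path);
        byte[] bytes = new byte[in.available()]; // naive sizing, mirrors the question
        in.read(bytes);
        in.close();
        String contents = new String(bytes);
        List<String[]> rows = new ArrayList<String[]>();
        for (String line : contents.split("\n")) {
            rows.add(line.split(","));
        }
        return rows;
    }
}
```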
Thanks
It sounds to me like you’re doing something wrong – a whole lot of object creation going on.
How representative is that sample file? What are you really doing with that data? If it’s typical of what you actually have, there’s a lot of repetition in it.
If it’s all going to be in Strings anyway, start with a BufferedReader to read each line. Pre-allocate that List to a size that’s close to what you need so you don’t waste resources adding to it each time. Split each of those lines at the comma; be sure to strip off the double quotes.
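A sketch of that suggestion, assuming every field is wrapped in double quotes as in the sample (the `parseLine` helper and the `expectedLines` parameter are my names, not anything from the question):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class LineParser {
    // Splits one CSV line at the commas and strips the surrounding
    // whitespace and double quotes from each field.
    static String[] parseLine(String line) {
        String[] fields = line.split(",");
        for (int i = 0; i < fields.length; i++) {
            fields[i] = fields[i].trim().replace("\"", "");
        }
        return fields;
    }

    static List<String[]> parse(String path, int expectedLines) throws IOException {
        // Pre-allocate close to the final size so the list doesn't
        // repeatedly grow and copy as rows are added.
        List<String[]> rows = new ArrayList<String[]>(expectedLines);
        BufferedReader reader = new BufferedReader(new FileReader(path));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                rows.add(parseLine(line));
            }
        } finally {
            reader.close();
        }
        return rows;
    }
}
```

Reading line by line means only one line's worth of temporary objects exists at any moment, instead of a second full copy of the file as a String.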
You might want to ask yourself: “Why do I need this whole file in memory all at once?” Can you read a little, process a little, and never have the whole thing in memory at once? Only you know your problem well enough to answer.
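One way to read a little and process a little, sketched with a per-line callback so no list of rows is ever built (the `LineHandler` interface is an assumption about what your processing step might look like):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class StreamingParser {
    interface LineHandler {
        void handle(String[] fields);
    }

    // Hands each parsed line to the handler as it is read; only the
    // current line's fields are live on the heap, regardless of file size.
    static void process(String path, LineHandler handler) throws IOException {
        BufferedReader reader = new BufferedReader(new FileReader(path));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] fields = line.split(",");
                for (int i = 0; i < fields.length; i++) {
                    fields[i] = fields[i].trim().replace("\"", "");
                }
                handler.handle(fields);
            }
        } finally {
            reader.close();
        }
    }
}
```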
Maybe you can fire up jvisualvm if you have JDK 6 and see what’s going on with memory. That would be a great clue.