Scenario:
Should I:
I have to keep things simple. Any suggestions?
Casting doesn't change the amount of memory an object occupies. It only changes the static type through which you refer to the object.
If you can do those operations on a per-row basis, then just do the operation immediately inside the loop wherein you read a single line.
while ((line = reader.readLine()) != null) {
line = process(line);
writer.println(line);
}
This way you effectively keep only a single line in memory at a time, instead of the whole file.
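For completeness, here is a minimal self-contained sketch of that approach; the file names and the process method are placeholders for your own input, output, and per-row transformation:

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class CsvLineProcessor {
    public static void main(String[] args) throws IOException {
        // "in.csv" and "out.csv" are placeholder file names
        try (BufferedReader reader = new BufferedReader(new FileReader("in.csv"));
             PrintWriter writer = new PrintWriter(new FileWriter("out.csv"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                writer.println(process(line));
            }
        }
    }

    // Placeholder for whatever per-row transformation you need.
    private static String process(String line) {
        return line.toUpperCase();
    }
}

The try-with-resources block makes sure both streams are closed even if processing a row throws.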
Or, if those operations depend on the entire CSV file (i.e., on all rows), then your most efficient bet is to import the CSV file into a real SQL database, alter the data with SQL statements, and then export it back to a CSV file.
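If you go the database route, a rough sketch using plain JDBC batch inserts might look like the following. The embedded H2 JDBC URL, table name, column layout, and file name are all assumptions, and the naive split(",") ignores quoted fields, so a real CSV parser may be needed:

import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;

public class CsvToDb {
    public static void main(String[] args) throws Exception {
        // "jdbc:h2:./csvdb" is an assumed embedded-database URL; any JDBC driver works.
        try (Connection conn = DriverManager.getConnection("jdbc:h2:./csvdb");
             BufferedReader reader = new BufferedReader(new FileReader("in.csv"))) {
            try (Statement st = conn.createStatement()) {
                // Illustrative two-column table; adapt to your CSV's layout.
                st.execute("CREATE TABLE IF NOT EXISTS rows (col1 VARCHAR, col2 VARCHAR)");
            }
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO rows (col1, col2) VALUES (?, ?)")) {
                String line;
                while ((line = reader.readLine()) != null) {
                    String[] fields = line.split(",", -1); // naive split, assumes 2 unquoted fields
                    ps.setString(1, fields[0]);
                    ps.setString(2, fields[1]);
                    ps.addBatch();
                }
                ps.executeBatch();
            }
            // From here you can run whatever SQL you need, then export the result back to CSV.
        }
    }
}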
I'd recommend using a MappedByteBuffer (from NIO), which lets you read a file too big to fit into memory. It maps only a region of the file into memory; once you're done reading that region (say, the first 10 KB), map the next one, and so on, until you've read the whole file. It's memory-efficient and quite easy to implement.
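As a rough illustration of that region-by-region approach; the file name, region size, and byte-level processing are placeholders, and note that a line can straddle two regions, so you would need to carry partial lines over yourself:

import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class MappedCsvReader {
    private static final long REGION_SIZE = 10 * 1024; // map ~10 KB at a time

    public static void main(String[] args) throws IOException {
        try (FileChannel channel = FileChannel.open(Paths.get("big.csv"), StandardOpenOption.READ)) {
            long fileSize = channel.size();
            long position = 0;
            long newlines = 0;
            while (position < fileSize) {
                long size = Math.min(REGION_SIZE, fileSize - position);
                MappedByteBuffer region = channel.map(FileChannel.MapMode.READ_ONLY, position, size);
                while (region.hasRemaining()) {
                    if (region.get() == '\n') {
                        newlines++; // replace with your own per-byte/per-line processing
                    }
                }
                position += size;
            }
            System.out.println("Lines read: " + newlines);
        }
    }
}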
Java casts like
Object a = new String();
String b = (String) a;
are not expensive, no matter whether you cast Strings or any other type. A reference cast is only a runtime type check; it doesn't copy the object or allocate memory.