Greetings, I get a huge number of records from the database and write them into a file. I was wondering what the best way is to write huge files (1 GB - 10 GB).
Currently I am using a BufferedWriter:
BufferedWriter mbrWriter = new BufferedWriter(new FileWriter(memberCSV));
while (!done) {
    // do the writing
}
mbrWriter.close();
If you really insist on using Java for this, then the best way would be to write each row immediately as it comes in, and thus not collect the entire ResultSet into Java's memory first. Otherwise you would need at least that much free heap memory in Java.
Thus, do e.g.
while (resultSet.next()) {
writer.write(resultSet.getString("columnname"));
// ...
}
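Put together, a minimal sketch could look like this (hedged: the query, the column names, the output file, and the connection variables url, user and pass are made up for illustration):

try (Connection con = DriverManager.getConnection(url, user, pass);
     PreparedStatement ps = con.prepareStatement("SELECT id, name, email FROM member");
     ResultSet rs = ps.executeQuery();
     BufferedWriter writer = new BufferedWriter(new FileWriter("member.csv"))) {
    while (rs.next()) {
        // Write each row out as soon as it arrives; only one row lives in memory at a time.
        writer.write(rs.getString("id"));
        writer.write(',');
        writer.write(rs.getString("name"));
        writer.write(',');
        writer.write(rs.getString("email"));
        writer.newLine();
    }
}

Since each row is written and then discarded, heap usage stays flat regardless of whether the file ends up being 1 GB or 10 GB.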
That said, most decent databases ship with built-in export-to-CSV capabilities which are undoubtedly far more efficient than anything you could ever write in Java. You didn't mention which one you're using, but if it were for example MySQL, you could use SELECT ... INTO OUTFILE for this (LOAD DATA INFILE is its import counterpart). Just refer to the DB-specific documentation. Hope this gives new insights.
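For illustration, a hedged sketch of triggering that export through JDBC (not a definitive recipe: the table name, file path, and connection variables url, user and pass are assumptions; INTO OUTFILE writes on the database server's filesystem and requires the FILE privilege):

try (Connection con = DriverManager.getConnection(url, user, pass);
     Statement st = con.createStatement()) {
    // Let MySQL write the CSV itself; no row ever passes through the JVM.
    st.execute("SELECT * FROM member"
        + " INTO OUTFILE '/tmp/member.csv'"
        + " FIELDS TERMINATED BY ',' ENCLOSED BY '\"'"
        + " LINES TERMINATED BY '\\n'");
}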
The default buffer size for a BufferedWriter is 8192 characters. If you are going to be writing multi-gigabyte files, you might want to increase this using the two-argument constructor; e.g.
int buffSize = 1024 * 1024; // 1 megabyte or so
BufferedWriter mbrWriter = new BufferedWriter(new FileWriter(memberCSV), buffSize);
This should reduce the number of syscalls needed to write the file.
But I doubt that this would make more than a couple of percent difference. Pulling rows from the result set will probably be the main performance bottleneck. For significant improvements in performance you'd need to use the database's native bulk export facilities.
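If the export has to stay in Java, the JDBC fetch size is one knob worth trying on the result-set side (a sketch under assumptions: stmt is your Statement, and the exact semantics are driver-specific; MySQL Connector/J, for instance, only streams row-by-row when given Integer.MIN_VALUE):

// Hint to the driver: pull rows from the server in batches of 1000,
// cutting network round-trips without materializing the whole result.
stmt.setFetchSize(1000);

// MySQL Connector/J special case: stream one row at a time instead.
// stmt.setFetchSize(Integer.MIN_VALUE);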