Java: writing large files?

Greetings, I get a huge number of records from a database and write them into a file. I was wondering what the best way is to write huge files (1 GB - 10 GB).

Currently I am using a BufferedWriter:

BufferedWriter mbrWriter = new BufferedWriter(new FileWriter(memberCSV));
while (!done) {
    // write the next batch of records
}
mbrWriter.close();
asked Jan 07 '10 by Ashika Umanga Umagiliya


2 Answers

If you really insist on using Java for this, then the best way would be to write each row immediately as it comes in, and thus not to collect all the data from the ResultSet into Java's memory first. Otherwise you would need at least that much free heap space.

Thus, do e.g.

while (resultSet.next()) {
    writer.write(resultSet.getString("columnname"));
    // ...
}
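
For completeness, here is a slightly fuller sketch of the same streaming approach, assuming a JDBC Connection and a CSV target; the query, column names and file name are only illustrative, and try-with-resources needs Java 7+:

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class MemberCsvExport {

    public static void export(Connection connection, String memberCSV) throws Exception {
        try (Statement statement = connection.createStatement();
             ResultSet resultSet = statement.executeQuery("SELECT id, name, email FROM member");
             BufferedWriter writer = new BufferedWriter(new FileWriter(memberCSV))) {
            while (resultSet.next()) {
                // write each row as soon as it is fetched; nothing is accumulated in memory
                writer.write(resultSet.getString("id"));
                writer.write(',');
                writer.write(resultSet.getString("name"));
                writer.write(',');
                writer.write(resultSet.getString("email"));
                writer.newLine();
            }
        }
    }
}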

That said, most decent DBs ship with built-in export-to-CSV capabilities which are undoubtedly far more efficient than anything you could do in Java. You didn't mention which one you're using, but if it were for example MySQL, you could have used SELECT ... INTO OUTFILE for this. Just refer to the DB-specific documentation. Hope this gives new insights.
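
For example, a minimal sketch assuming MySQL and a plain JDBC Statement; note that SELECT ... INTO OUTFILE writes to the database server's filesystem, not the client's, and the path and columns here are illustrative:

import java.sql.Connection;
import java.sql.Statement;

public class ServerSideCsvExport {

    public static void export(Connection connection) throws Exception {
        try (Statement statement = connection.createStatement()) {
            // The MySQL server process writes the file itself, so this avoids
            // pulling every row across the wire into the JVM.
            statement.execute(
                "SELECT id, name, email FROM member"
              + " INTO OUTFILE '/tmp/member.csv'"
              + " FIELDS TERMINATED BY ',' ENCLOSED BY '\"'"
              + " LINES TERMINATED BY '\\n'");
        }
    }
}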

answered by BalusC


The default buffer size for a BufferedWriter is 8192 characters. If you are going to be writing multi-gigabyte files, you might want to increase this using the two-argument constructor; e.g.

int buffSize = 1024 * 1024; // 1 megabyte or so
BufferedWriter mbrWriter = new BufferedWriter(new FileWriter(memberCSV), buffSize);

This should reduce the number of syscalls needed to write the file.

But I doubt that this would make more than a couple of percent difference. Pulling rows from the resultset will probably be the main performance bottleneck. For significant improvements in performance you'd need to use the database's native bulk export facilities.
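
If you do stay with plain JDBC, one related knob is the fetch size, which hints how many rows the driver should pull per round trip. A sketch follows; the value is illustrative, and some drivers (e.g. MySQL Connector/J) only stream row by row when given Integer.MIN_VALUE on a forward-only, read-only statement:

// Hint the driver to fetch rows in batches rather than buffering the whole result.
// Whether and how the hint is honoured is driver-specific.
Statement statement = connection.createStatement();
statement.setFetchSize(1000);
ResultSet resultSet = statement.executeQuery("SELECT id, name, email FROM member");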

answered by Stephen C