
Fastest Way To Read and Write Large Files Line By Line in Java

I have been searching a lot for the fastest way to read and then write back large files (0.5 - 1 GB) in Java with limited memory (about 64 MB). Each line in the file represents a record, so I need to process them line by line. The file is a plain text file.

I tried BufferedReader and BufferedWriter, but they don't seem to be the best option. Reading and writing a 0.5 GB file takes about 35 seconds with no processing at all, just read and write. I think the bottleneck is the writing, since reading alone takes about 10 seconds.
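A minimal sketch of that kind of line-by-line copy (the file names are placeholders, not from the original post):

BufferedReader reader = new BufferedReader(new FileReader("in.txt"));
BufferedWriter writer = new BufferedWriter(new FileWriter("out.txt"));
for (String line; (line = reader.readLine()) != null; ) {
    // copy each record unchanged
    writer.write(line);
    writer.newLine();
}
reader.close();
writer.close();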

I also tried reading into a byte array, but then scanning each array for line breaks takes more time.

Any suggestions please? Thanks

asked Oct 31 '12 by user1785771

People also ask

What is the easiest way to read text files line by line in Java 8?

Java 8 added a new method, lines(), to the Files class, which can be used to read a file line by line. The beauty of this method is that it returns all lines of the file as a Stream of String, which is populated lazily as the stream is consumed.
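A minimal sketch of that approach (the path is a placeholder; Files and Paths come from java.nio.file, Stream from java.util.stream):

try (Stream<String> lines = Files.lines(Paths.get("records.txt"))) {
    // the stream reads the file lazily, one line at a time
    lines.forEach(line -> {
        // process one record
    });
}

The try-with-resources block matters here, because the returned stream keeps the underlying file open until it is closed.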


2 Answers

I suspect your real problem is that you have limited hardware, and what you do in software won't make much difference. If you have plenty of memory and CPU, more advanced tricks can help, but if you are just waiting on your hard drive because the file is not cached, it won't make much difference.

BTW: 500 MB in 10 seconds, i.e. 50 MB/s, is a typical read speed for a HDD.

Try running the following to see at what point your system is unable to cache the file efficiently.

public static void main(String... args) throws IOException {
    for (int mb : new int[]{50, 100, 250, 500, 1000, 2000})
        testFileSize(mb);
}

private static void testFileSize(int mb) throws IOException {
    File file = File.createTempFile("test", ".txt");
    file.deleteOnExit();
    char[] chars = new char[1024];
    Arrays.fill(chars, 'A');
    String longLine = new String(chars);

    long start1 = System.nanoTime();
    PrintWriter pw = new PrintWriter(new FileWriter(file));
    for (int i = 0; i < mb * 1024; i++)
        pw.println(longLine);
    pw.close();
    long time1 = System.nanoTime() - start1;
    System.out.printf("Took %.3f seconds to write to a %d MB, file rate: %.1f MB/s%n",
            time1 / 1e9, file.length() >> 20, file.length() * 1000.0 / time1);

    long start2 = System.nanoTime();
    BufferedReader br = new BufferedReader(new FileReader(file));
    for (String line; (line = br.readLine()) != null; ) {
        // read each line and discard it
    }
    br.close();
    long time2 = System.nanoTime() - start2;
    System.out.printf("Took %.3f seconds to read to a %d MB file, rate: %.1f MB/s%n",
            time2 / 1e9, file.length() >> 20, file.length() * 1000.0 / time2);
    file.delete();
}

On a Linux machine with lots of memory.

Took 0.395 seconds to write to a 50 MB, file rate: 133.0 MB/s
Took 0.375 seconds to read to a 50 MB file, rate: 140.0 MB/s
Took 0.669 seconds to write to a 100 MB, file rate: 156.9 MB/s
Took 0.569 seconds to read to a 100 MB file, rate: 184.6 MB/s
Took 1.585 seconds to write to a 250 MB, file rate: 165.5 MB/s
Took 1.274 seconds to read to a 250 MB file, rate: 206.0 MB/s
Took 2.513 seconds to write to a 500 MB, file rate: 208.8 MB/s
Took 2.332 seconds to read to a 500 MB file, rate: 225.1 MB/s
Took 5.094 seconds to write to a 1000 MB, file rate: 206.0 MB/s
Took 5.041 seconds to read to a 1000 MB file, rate: 208.2 MB/s
Took 11.509 seconds to write to a 2001 MB, file rate: 182.4 MB/s
Took 9.681 seconds to read to a 2001 MB file, rate: 216.8 MB/s

On a Windows machine with lots of memory.

Took 0.376 seconds to write to a 50 MB, file rate: 139.7 MB/s
Took 0.401 seconds to read to a 50 MB file, rate: 131.1 MB/s
Took 0.517 seconds to write to a 100 MB, file rate: 203.1 MB/s
Took 0.520 seconds to read to a 100 MB file, rate: 201.9 MB/s
Took 1.344 seconds to write to a 250 MB, file rate: 195.4 MB/s
Took 1.387 seconds to read to a 250 MB file, rate: 189.4 MB/s
Took 2.368 seconds to write to a 500 MB, file rate: 221.8 MB/s
Took 2.454 seconds to read to a 500 MB file, rate: 214.1 MB/s
Took 4.985 seconds to write to a 1001 MB, file rate: 210.7 MB/s
Took 5.132 seconds to read to a 1001 MB file, rate: 204.7 MB/s
Took 10.276 seconds to write to a 2003 MB, file rate: 204.5 MB/s
Took 9.964 seconds to read to a 2003 MB file, rate: 210.9 MB/s
answered by Peter Lawrey

The first thing I would try is increasing the buffer size of the BufferedReader and BufferedWriter. The default buffer size is not documented, but at least in the Oracle VM it is 8192 characters, which is too small to bring much of a performance advantage for files this size.
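For example, a sketch with explicit buffer sizes (1 MB here is purely illustrative, not a value from the answer):

int bufferSize = 1024 * 1024; // illustrative 1 MB buffer
BufferedReader reader = new BufferedReader(new FileReader("d:/test.txt"), bufferSize);
BufferedWriter writer = new BufferedWriter(new FileWriter("d:/test2.txt"), bufferSize);
for (String line; (line = reader.readLine()) != null; ) {
    writer.write(line);
    writer.newLine();
}
reader.close();
writer.close();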

If you only need to copy the file (and don't need actual access to the data), I would drop the Reader/Writer approach and either work directly with InputStream and OutputStream, using a byte array as a buffer:

int bufferSize = 64 * 1024; // any reasonably large buffer; 64 KB is just an example
FileInputStream fis = new FileInputStream("d:/test.txt");
FileOutputStream fos = new FileOutputStream("d:/test2.txt");
byte[] b = new byte[bufferSize];
int r;
while ((r = fis.read(b)) >= 0) {
    fos.write(b, 0, r);
}
fis.close();
fos.close();

or actually use NIO:

FileChannel in = new RandomAccessFile("d:/test.txt", "r").getChannel();
FileChannel out = new RandomAccessFile("d:/test2.txt", "rw").getChannel();
out.transferFrom(in, 0, Long.MAX_VALUE);
in.close();
out.close();

When benchmarking the different copy methods, however, I see much larger differences in duration between runs of the benchmark than between the different implementations. I/O caching (both at the OS level and in the hard disk cache) plays a big role here, and it is very difficult to say which is faster. On my hardware, copying a 1 GB text file line by line with BufferedReader and BufferedWriter takes less than 5 s in some runs and more than 30 s in others.

answered by jarnbjo