 

Fastest way to incrementally read a large file

Tags:

java

file-io

nio

When given a buffer of MAX_BUFFER_SIZE, and a file that far exceeds it, how can one:

  1. Read the file in blocks of MAX_BUFFER_SIZE?
  2. Do it as fast as possible?

I tried using NIO

    RandomAccessFile aFile = new RandomAccessFile(fileName, "r");
    FileChannel inChannel = aFile.getChannel();

    ByteBuffer buffer = ByteBuffer.allocate(CAPACITY);

    int bytesRead = inChannel.read(buffer);

    while (bytesRead != -1) {
        buffer.flip();

        while (buffer.hasRemaining()) {
            buffer.get();
        }

        buffer.clear();
        bytesRead = inChannel.read(buffer);
    }

    aFile.close();

And regular IO

    InputStream in = new FileInputStream(fileName);

    long length = new File(fileName).length();

    if (length > Integer.MAX_VALUE) {
        throw new IOException("File is too large!");
    }

    byte[] bytes = new byte[(int) length];

    int offset = 0;

    int numRead = 0;

    while (offset < bytes.length
            && (numRead = in.read(bytes, offset, bytes.length - offset)) >= 0) {
        offset += numRead;
    }

    if (offset < bytes.length) {
        throw new IOException("Could not completely read file " + fileName);
    }

    in.close();

Turns out that regular IO is about 100 times faster in doing the same thing as NIO. Am I missing something? Is this expected? Is there a faster way to read the file in buffer chunks?

Ultimately I am working with a large file that I don't have enough memory to read all at once. Instead, I'd like to read it incrementally, in blocks that would then be used for processing.

James Raitsev asked Jan 28 '12

2 Answers

If you want to make your first example faster:

    FileChannel inChannel = new FileInputStream(fileName).getChannel();
    ByteBuffer buffer = ByteBuffer.allocateDirect(CAPACITY);

    while (inChannel.read(buffer) > 0) {
        buffer.clear(); // do something with the data and clear/compact it.
    }

    inChannel.close();
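Spelled out, the read/flip/drain/clear cycle hinted at by the comment above might look like this self-contained sketch. The class name, the 8192-byte chunk size, and the temp-file demo in `main` are all assumptions for illustration:

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;

public class ChunkedNioRead {
    static final int MAX_BUFFER_SIZE = 8192; // assumed chunk size

    // Reads the file one MAX_BUFFER_SIZE chunk at a time; returns total bytes seen.
    static long readInChunks(String fileName) throws IOException {
        long total = 0;
        try (FileChannel in = new FileInputStream(fileName).getChannel()) {
            ByteBuffer buffer = ByteBuffer.allocateDirect(MAX_BUFFER_SIZE);
            while (in.read(buffer) > 0) {
                buffer.flip();              // switch from filling to draining
                total += buffer.remaining();
                // process the buffer contents here
                buffer.clear();             // make it ready for the next read
            }
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("demo", ".bin");
        Files.write(tmp, new byte[100_000]);
        System.out.println(readInChunks(tmp.toString())); // prints 100000
        Files.delete(tmp);
    }
}
```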

If you want it to be even faster:

    FileChannel inChannel = new RandomAccessFile(fileName, "r").getChannel();
    MappedByteBuffer buffer = inChannel.map(FileChannel.MapMode.READ_ONLY, 0, inChannel.size());
    // access the buffer as you wish.
    inChannel.close();

This can take 10 to 20 microseconds for files up to 2 GB in size.
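To actually consume the mapped file in fixed-size blocks, one option is to bulk-copy slices out of the mapped buffer. A minimal sketch under assumed names (the class, the 8192-byte chunk size, and the demo file are illustrative, not part of the answer above):

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;

public class MappedChunks {
    static final int MAX_BUFFER_SIZE = 8192; // assumed chunk size

    // Walks a memory-mapped file in fixed-size slices; returns total bytes seen.
    static long processInChunks(String fileName) throws IOException {
        try (FileChannel ch = new RandomAccessFile(fileName, "r").getChannel()) {
            MappedByteBuffer map = ch.map(FileChannel.MapMode.READ_ONLY, 0, ch.size());
            byte[] chunk = new byte[MAX_BUFFER_SIZE];
            long total = 0;
            while (map.hasRemaining()) {
                int n = Math.min(chunk.length, map.remaining());
                map.get(chunk, 0, n); // bulk copy of one chunk
                total += n;           // process chunk[0..n) here
            }
            return total;
        }
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("demo", ".bin");
        Files.write(tmp, new byte[20_000]);
        System.out.println(processInChunks(tmp.toString())); // prints 20000
    }
}
```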

Peter Lawrey answered Sep 29 '22


Assuming that you need to read the entire file into memory at once (as you're currently doing), neither reading smaller chunks nor NIO is going to help you here.

In fact, you'd probably be best off reading larger chunks - which your regular IO code is automatically doing for you.

Your NIO code is currently slower because you're only reading one byte at a time (using buffer.get()).
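For comparison, the bulk ByteBuffer.get(byte[]) overload drains the buffer in a single call rather than one call per byte. A minimal standalone sketch (buffer size chosen arbitrarily for the demo):

```java
import java.nio.ByteBuffer;

public class BulkGet {
    public static void main(String[] args) {
        ByteBuffer buffer = ByteBuffer.wrap(new byte[65536]);

        // slow: one method call per byte
        // while (buffer.hasRemaining()) { buffer.get(); }

        // fast: a single bulk copy drains the whole buffer
        byte[] chunk = new byte[buffer.remaining()];
        buffer.get(chunk);

        System.out.println(chunk.length + " " + buffer.hasRemaining()); // prints "65536 false"
    }
}
```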

If you want to process in chunks - for example, transferring between streams - here is a standard way of doing it without NIO:

    InputStream is = ...;
    OutputStream os = ...;

    byte[] buffer = new byte[1024];
    int read;
    while ((read = is.read(buffer)) != -1) {
        os.write(buffer, 0, read);
    }

This uses a buffer size of only 1 KB, but can transfer an unlimited amount of data.

(If you extend your question with details of what you're actually looking to do at a functional level, I could further improve this to a better answer.)

ziesemer answered Sep 29 '22