What is the best buffer size when using BinaryReader to read big files (>1 GB)?

I'm reading binary files and here is a sample:

public static byte[] ReadFully(Stream input)
{
    byte[] buffer = new byte[16*1024];
    using (MemoryStream ms = new MemoryStream())
    {
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Append each chunk that was actually read to the in-memory copy.
            ms.Write(buffer, 0, read);
        }
        return ms.ToArray();
    }
}
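
For context, a typical call site looks roughly like this (the path, the FileStream's internal buffer size, and the FileOptions flag are placeholders for illustration, not values from my actual code):

using (var fs = new FileStream(@"D:\data\big.bin", FileMode.Open, FileAccess.Read,
                               FileShare.Read, 16*1024, FileOptions.SequentialScan))
{
    // ReadFully is the helper defined above; data holds the whole file in memory.
    byte[] data = ReadFully(fs);
}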

Obviously the buffer size (16*1024) plays a major role in performance. I've read that it depends on the I/O technology (SATA, SSD, SCSI, etc.) and also on the allocation unit (cluster) size of the partition the file resides on, which can be chosen when formatting the partition.
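
As an aside, on Windows the cluster (allocation unit) size can be queried with the Win32 GetDiskFreeSpace API. Here is a rough sketch (it assumes using System.Runtime.InteropServices; and the drive root passed in is just an example):

[DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Auto)]
static extern bool GetDiskFreeSpace(string lpRootPathName,
    out uint lpSectorsPerCluster, out uint lpBytesPerSector,
    out uint lpNumberOfFreeClusters, out uint lpTotalNumberOfClusters);

static uint GetClusterSize(string rootPath)
{
    // Cluster size = sectors per cluster * bytes per sector.
    uint sectorsPerCluster, bytesPerSector, freeClusters, totalClusters;
    if (!GetDiskFreeSpace(rootPath, out sectorsPerCluster, out bytesPerSector,
                          out freeClusters, out totalClusters))
        throw new System.ComponentModel.Win32Exception();
    return sectorsPerCluster * bytesPerSector;
}

// GetClusterSize(@"C:\") typically returns 4096 on a default NTFS volume.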

But here is the question: Is there any formula or best practice for choosing the buffer size? Right now, I'm choosing it by trial and error.

Edit: I've tested the application on my server with different buffer sizes, and I get the best performance with 4095*256*16 (roughly 16 MB)! Using 4096 instead of 4095 is 4 seconds slower.
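
A crude way to run such a comparison is a timing loop along these lines (the file path and the candidate sizes are only examples, and the OS file cache can easily skew repeated runs):

int[] bufferSizes = { 4*1024, 16*1024, 64*1024, 1024*1024, 4095*256*16, 4096*256*16 };

foreach (int size in bufferSizes)
{
    byte[] buffer = new byte[size];
    var sw = System.Diagnostics.Stopwatch.StartNew();

    using (var fs = new FileStream(@"D:\test\big.bin", FileMode.Open, FileAccess.Read,
                                   FileShare.Read, size, FileOptions.SequentialScan))
    {
        // Read the whole file and discard the data; only throughput matters here.
        while (fs.Read(buffer, 0, buffer.Length) > 0) { }
    }

    sw.Stop();
    Console.WriteLine("{0,12:N0} bytes: {1} ms", size, sw.ElapsedMilliseconds);
}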

Here are some older posts which are very helpful, but I still can't work out the reason:

  • Faster (unsafe) BinaryReader in .NET

  • Optimum file buffer read size?

  • File I/O with streams - best memory buffer size

  • How do you determine the ideal buffer size when using FileInputStream?