I often see 4096 used as a default buffer size all over the place. Is there any reason why 4096 was chosen rather than some other value?
1024 is the number of bytes in a kilobyte, in the binary convention (1 KB = 2^10 bytes) commonly used in programming. All that line means is that it creates a buffer of 16 KB: 1024 * 16 = 16384 bytes. That's really all there is to it. If you want to go down the route of why a kilobyte is treated as 1024 bytes and why that convention is convenient in programming, this would be a good place to start.
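For illustration, a buffer like that is typically used to copy one stream to another. The sketch below is only an assumption of what the surrounding code might look like; the CopyStream helper name is made up, not taken from the code being discussed:

using System.IO;

static class StreamUtil
{
    // Illustrative helper: copies input to output using a 16 KB buffer.
    public static void CopyStream(Stream input, Stream output)
    {
        // 1024 * 16 = 16384 bytes, i.e. a 16 KB buffer.
        byte[] buffer = new byte[1024 * 16];
        int bytesRead;
        while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            output.Write(buffer, 0, bytesRead);
        }
    }
}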
The upper limit for the maximum buffer size is 32768 bytes (32 KB).
A good buffer size for recording is 128 samples, but you can usually raise it to 256 samples without noticing much latency in the signal. You can also decrease the buffer size below 128, but then some plugins and effects may not run in real time.
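To see why those sizes are tolerable, buffer latency is simply the buffer size divided by the sample rate. The calculation below assumes a 44.1 kHz sample rate, which is not specified above:

using System;

class BufferLatency
{
    static void Main()
    {
        const double sampleRateHz = 44100.0; // assumed sample rate, not stated in the answer

        foreach (int bufferSamples in new[] { 128, 256 })
        {
            // Latency in milliseconds = samples / samples-per-second * 1000
            double latencyMs = bufferSamples / sampleRateHz * 1000.0;
            Console.WriteLine($"{bufferSamples} samples ~ {latencyMs:F1} ms");
        }
    }
}

At 44.1 kHz this comes out to roughly 2.9 ms for 128 samples and 5.8 ms for 256 samples.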
The StreamWriter.BaseStream property is initialized using stream. [Note: The default buffer size is typically around 4 KB.]
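As a sketch of what that looks like in practice (the file path, encoding, and explicit 4096 buffer size here are illustrative assumptions, not values taken from the documentation quoted above):

using System.IO;
using System.Text;

class WriterExample
{
    static void Main()
    {
        // Illustrative path; the explicit 4096 echoes the ~4 KB figure mentioned above.
        using (var fs = new FileStream("output.txt", FileMode.Create, FileAccess.Write))
        using (var writer = new StreamWriter(fs, Encoding.UTF8, 4096))
        {
            // writer.BaseStream now refers to the FileStream passed to the constructor.
            writer.WriteLine("hello");
        }
    }
}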
It really depends on your problem, but a common compromise is 4 KB. You will find a good discussion of this choice at the links listed below; a short sketch follows them:
File I/O with streams - best memory buffer size
http://social.msdn.microsoft.com/Forums/en-US/ed3c6dea-400e-489c-9a86-b43b3a78cc1c/quick-question-about-filestream-buffering?forum=csharpgeneral
C# FileStream : Optimal buffer size for writing large files?
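A minimal sketch of the 4 KB compromise when reading a file (the path and the empty processing loop are placeholders, not code from the linked threads):

using System.IO;

class ReadExample
{
    static void Main()
    {
        const int bufferSize = 4096; // the 4 KB compromise discussed above

        // The explicit bufferSize also sets FileStream's internal buffer size.
        using (var fs = new FileStream("input.bin", FileMode.Open, FileAccess.Read,
                                       FileShare.Read, bufferSize))
        {
            var buffer = new byte[bufferSize];
            int read;
            while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Process 'read' bytes of 'buffer' here.
            }
        }
    }
}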