
Is file buffering by OS harmful?

I wrote a download library for my colleagues. It writes downloaded data to files.

My colleagues found that the file stays small for a long time, even after 100 MB of data has been downloaded.

So they suggested that I call flush() after every write(), so that the data would not pile up in memory.

But I don't think 100 MB of virtual memory is a lot, and I suspect Windows has its reasons for buffering so much data.

What do you think about it?
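To make the trade-off concrete, here is a minimal Python sketch of a download loop (the chunks and file name are hypothetical, not from the original library). `write()` hands data to user-space and OS buffers; `flush()` drains Python's own buffer to the OS, and `os.fsync()` additionally forces the OS to push it to disk, which is what defeats the OS-level buffering being discussed:

```python
import os

def save_chunks(chunks, path, flush_every_write=False):
    """Write downloaded chunks to a file, optionally forcing them to disk."""
    with open(path, "wb") as f:
        for chunk in chunks:
            f.write(chunk)            # may sit in buffers for a while
            if flush_every_write:
                f.flush()             # drain Python's buffer to the OS
                os.fsync(f.fileno())  # force the OS cache out to disk

# Simulate a 1 MB download arriving in 64 KB chunks.
save_chunks([b"\0" * 65536] * 16, "download.bin")
print(os.path.getsize("download.bin"))  # 1048576
```

Calling the function with `flush_every_write=True` trades throughput (every chunk waits on the disk) for durability; the default leaves scheduling to the OS.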

asked Dec 18 '22 by lilyonwind

2 Answers

I would trust the operating system to tune itself appropriately, personally.

As for "flush immediately so as not to lose data if the power dies": if the power dies halfway through a file, would you trust that the data you'd written was okay, and resume the download from there? If so, maybe it's worth flushing early; personally, I'd weigh the complexity of resuming against the relative rarity of power failures, and just close the file when I'd read everything. If you see a half-written file, delete it and download it again from scratch.
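The "delete it and download it again" approach can be sketched in a few lines (a sketch only; `fetch_chunks`, the file names, and the error handling are hypothetical). Writing to a temporary name and renaming only on success means a half-written file is never mistaken for a finished one:

```python
import os

def download_atomically(fetch_chunks, path):
    """Write to a temp name and rename only when the download completed.
    `fetch_chunks` is a hypothetical callable yielding byte chunks."""
    tmp = path + ".part"
    try:
        with open(tmp, "wb") as f:
            for chunk in fetch_chunks():
                f.write(chunk)
        os.replace(tmp, path)   # atomic rename on the same filesystem
    except Exception:
        if os.path.exists(tmp):
            os.remove(tmp)      # half-written file: discard, retry later
        raise

download_atomically(lambda: iter([b"hello ", b"world"]), "out.bin")
```

Because the rename is atomic, a crash or power failure at any point leaves either no visible file or a complete one, with no resume logic needed.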

answered Jan 03 '23 by Jon Skeet


Well, first you should investigate and debug what is actually going on. The problem might be elsewhere; for example, Windows Explorer might simply not refresh the displayed file size fast enough.

That said, you are right: generally, if the VM subsystem of the OS decides to buffer data in RAM, it has a good reason to do so, and you should not normally interfere. If there is a lot of free memory, it makes sense to use it, after all.
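One quick way to investigate (a sketch, assuming Python; the probe file name is made up): ask the kernel for the size directly with `stat()` instead of trusting a file manager's cached listing. The size the OS reports includes data still sitting in its cache, even before it reaches the disk:

```python
import os

with open("probe.bin", "wb") as f:
    f.write(b"x" * 10_000_000)  # 10 MB, likely still in the OS cache
    f.flush()                   # drain Python's user-space buffer to the OS
    # stat() asks the kernel, not a file manager's cached view:
    print(os.stat("probe.bin").st_size)  # 10000000, even before any fsync
```

If this reports the full size while Explorer still shows a small file, the "missing" data was never stuck in the download library at all, only in the file manager's stale display.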

answered Jan 03 '23 by sleske