How to calculate the optimum chunk size for uploading large files

Is there such a thing as an optimum chunk size for processing large files? I have an upload service (WCF) which accepts file uploads ranging up to several hundred megabytes.

I've experimented with chunk sizes from 4 KB and 8 KB up to 1 MB. Bigger chunk sizes are good for performance (faster processing), but they come at the cost of memory.

So, is there a way to work out the optimum chunk size at the moment a file is uploaded? How would one go about such a calculation? Would it be some combination of available memory on the client, CPU and network bandwidth that determines the optimum size?

Cheers

EDIT: I probably should mention that the client app will be in Silverlight.
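
For reference, here is a minimal sketch of the kind of chunked upload loop I mean. IUploadService and UploadChunk are just placeholder names, not my actual WCF contract; chunkSize is the value I'm trying to tune.

    using System;
    using System.IO;

    // Placeholder contract for illustration only.
    interface IUploadService
    {
        void UploadChunk(string fileName, long offset, byte[] data);
    }

    static class Uploader
    {
        static void UploadFile(IUploadService client, string path, int chunkSize)
        {
            using (FileStream stream = File.OpenRead(path))
            {
                byte[] buffer = new byte[chunkSize];
                long offset = 0;
                int bytesRead;
                while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    byte[] chunk = new byte[bytesRead];
                    Array.Copy(buffer, chunk, bytesRead);   // last chunk may be short
                    client.UploadChunk(Path.GetFileName(path), offset, chunk);
                    offset += bytesRead;
                }
            }
        }
    }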

Asked Sep 09 '10 by Fixer


1 Answer

If you are concerned about running out of resources, then the optimum is probably best determined by evaluating your peak upload concurrency against your system's available memory. The number of simultaneous uploads in progress is the critical variable in any calculation you might do. All you have to do is make sure you have enough memory to handle that concurrency, and that's rather trivial to achieve. Memory is cheap, and you will likely run out of network bandwidth long before your concurrency overruns your available memory.
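
To make that concrete, here is a rough back-of-the-envelope check; the numbers are mine and purely illustrative:

    // Peak buffer memory is roughly (concurrent uploads) x (chunk size).
    int concurrentUploads = 200;                // assumed peak concurrency
    int chunkSizeBytes = 1024 * 1024;           // 1 MB chunks
    long peakBufferBytes = (long)concurrentUploads * chunkSizeBytes;  // ~200 MB

Even with generous chunk sizes and a lot of simultaneous uploads, the buffering cost stays small compared with the memory on a typical server.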

On the performance side, this isn't the kind of thing you can really optimize much during app design and development. You need the system in place, with users uploading files for real, before you can monitor actual runtime performance.

Try a chunk size that matches your network's TCP/IP window size. That's about as optimal as you'd really need to get at design time.
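
As a design-time sketch: the 64 KB figure below is just an assumed un-scaled TCP receive window (window scaling and your actual network may change it), and the budget and concurrency numbers are illustrative, not prescriptive:

    // Start from an assumed TCP window, capped by a per-upload memory budget.
    const int assumedTcpWindowBytes = 64 * 1024;          // ~un-scaled TCP window
    const long memoryBudgetBytes = 256L * 1024 * 1024;    // server-side buffer budget
    const int expectedConcurrentUploads = 100;            // expected peak

    long perUploadCap = memoryBudgetBytes / expectedConcurrentUploads;   // ~2.5 MB each
    int chunkSize = (int)Math.Min(assumedTcpWindowBytes, perUploadCap);  // -> 64 KB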

Answered Sep 24 '22 by Stephen M. Redd