
Download a file programmatically in multiple chunks in parallel in restartable mode

I need to download a large file over HTTP via a quite slow network connection. When I do it manually, the download speed is sometimes unbearably slow, and the process sometimes freezes or terminates.

For manual downloads, the situation can be greatly improved by using a download manager (e.g. FDM), a class of programs that was indispensable and extremely popular a decade or so ago but whose usage has quickly diminished thanks to better and faster networking. A download manager starts multiple download sessions for the same file in parallel, each fetching a chunk from a different starting position; it automatically restarts failed or stale sessions, balances the work (after a chunk completes successfully, it splits one of the remaining in-progress chunks between two sessions), and finally stitches all downloaded chunks into a single complete file. Overall, it makes file downloading robust and much faster on poor connections.

Now I am trying to implement the same download behavior in C# for automatic unattended downloads. I cannot find any existing class in the .NET Framework that implements this, so I am looking for advice on how to implement it manually (possibly with the aid of some open-source .NET libraries).

Nik Z. asked Aug 02 '13 17:08


1 Answer

This is possible using the HttpWebRequest.AddRange method, which lets you request the bytes of a file from a specific range. If a partially downloaded file already exists, read its length and pass that offset to HttpWebRequest.AddRange so the download resumes where it left off. See a code sample at CodeProject:

http://www.codeproject.com/Tips/307548/Resume-Suppoert-Downloading

For additional information about passing different types of ranges, see: http://msdn.microsoft.com/en-us/library/4ds43y3w.aspx
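To combine range requests with the parallel, restartable behavior described in the question, here is a minimal sketch (not the CodeProject sample; names like `DownloadChunk`, the chunk count, and the `.partN` file naming are illustrative). It assumes the server reports Content-Length and honors Range requests; a production version should check the `Accept-Ranges` header and add retry logic around each chunk.

```csharp
using System;
using System.IO;
using System.Net;
using System.Threading.Tasks;

class ChunkedDownloader
{
    // Download one byte range [from, to] into its own part file,
    // resuming from whatever is already on disk.
    static void DownloadChunk(string url, string partPath, long from, long to)
    {
        long have = File.Exists(partPath) ? new FileInfo(partPath).Length : 0;
        if (have >= to - from + 1) return; // chunk already complete

        var request = (HttpWebRequest)WebRequest.Create(url);
        request.AddRange(from + have, to); // resume mid-chunk
        using (var response = request.GetResponse())
        using (var body = response.GetResponseStream())
        using (var file = new FileStream(partPath, FileMode.Append))
            body.CopyTo(file);
    }

    static void Main(string[] args)
    {
        string url = args[0], target = args[1];

        // Ask the server for the total size via a HEAD request.
        var head = (HttpWebRequest)WebRequest.Create(url);
        head.Method = "HEAD";
        long total;
        using (var response = head.GetResponse())
            total = response.ContentLength;

        const int chunks = 4; // illustrative; tune for your connection
        long chunkSize = (total + chunks - 1) / chunks;

        // Download all chunks in parallel; rerunning the program
        // resumes each unfinished chunk where it left off.
        Parallel.For(0, chunks, i =>
        {
            long from = i * chunkSize;
            long to = Math.Min(from + chunkSize, total) - 1;
            DownloadChunk(url, target + ".part" + i, from, to);
        });

        // Stitch the part files together into the final file.
        using (var output = new FileStream(target, FileMode.Create))
            for (int i = 0; i < chunks; i++)
                using (var part = File.OpenRead(target + ".part" + i))
                    part.CopyTo(output);
    }
}
```

The sketch deliberately omits the work-balancing step (splitting a slow in-progress chunk between two sessions); that requires tracking per-chunk progress and is where an existing library may save effort.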

Martijn van Put answered Nov 03 '22 00:11