
Download large file from HTTP with resume/retry support in .NET?

How to implement downloading a large file (~500MB) from HTTP in my application? I want to support automatic resume/retry, so that when connection is disconnected, my application can try to reconnect to get the file and avoid re-downloading the downloaded part, if possible (I know, this depends on the server as well).

This is similar to the behaviour in download managers and some browsers.

Louis Rhys asked Apr 29 '13


2 Answers

You can implement downloading from a web server in C# from scratch in one of two ways:

  1. Using the high-level APIs in System.Net such as HttpWebRequest, HttpWebResponse, FtpWebRequest, and other classes.

  2. Using the low-level APIs in System.Net.Sockets such as TcpClient, TcpListener and Socket classes.

The advantage of the first approach is that you typically don't have to worry about low-level plumbing such as preparing and interpreting HTTP headers, or handling proxies, authentication, caching, etc. The high-level classes do this for you, which is why I prefer this approach.

Using the first method, typical code to prepare an HTTP request to download a file from a URL looks something like this:

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(Url);
if (UseProxy)
{
    request.Proxy = new WebProxy(ProxyServer + ":" + ProxyPort.ToString());
    if (ProxyUsername.Length > 0)
        request.Proxy.Credentials = new NetworkCredential(ProxyUsername, ProxyPassword);
}

// Resume from where the previous attempt stopped, if we already have some bytes.
if (BytesRead > 0)
    request.AddRange(BytesRead);

WebResponse response = request.GetResponse();
if (!resuming)
{
    Size = (int)response.ContentLength;
    SizeInKB = Size / 1024;
}

// The server advertises support for partial downloads via the Accept-Ranges header.
acceptRanges = String.Compare(response.Headers["Accept-Ranges"], "bytes", true) == 0;

// Create the network stream from which to read the file's bytes.
ns = response.GetResponseStream();

At the end of the above code, you get a network-stream object which you can then use to read the bytes of the remote file as you would any other stream object. Whether the remote URL supports resuming partial downloads by letting you read from an arbitrary position is indicated by the "Accept-Ranges" HTTP header, as shown above. If this value is set to anything other than "bytes", then this feature won't be supported.
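Putting the pieces together, the retry/resume logic can be sketched as a loop that seeds `AddRange` from the local file's current length and appends to it when the server answers with 206 Partial Content. This is only a minimal illustration of the idea, not code from the project mentioned below; the names `DownloadWithResume`, `localPath`, and `maxRetries` are made up for the example, and proxy handling and progress reporting are omitted.

```csharp
using System;
using System.IO;
using System.Net;

class ResumableDownloader
{
    // Sketch: download url to localPath, resuming after dropped connections.
    public static void DownloadWithResume(string url, string localPath, int maxRetries)
    {
        for (int attempt = 0; attempt < maxRetries; attempt++)
        {
            try
            {
                // How much we already have on disk from a previous attempt.
                long bytesRead = File.Exists(localPath) ? new FileInfo(localPath).Length : 0;

                HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
                if (bytesRead > 0)
                    request.AddRange(bytesRead); // ask the server to skip what we already have

                using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
                using (Stream ns = response.GetResponseStream())
                // Append only if the server honoured the range request (206 Partial Content);
                // otherwise it sent the whole file again, so start over.
                using (FileStream fs = new FileStream(localPath,
                    response.StatusCode == HttpStatusCode.PartialContent
                        ? FileMode.Append
                        : FileMode.Create))
                {
                    byte[] buffer = new byte[64 * 1024];
                    int read;
                    while ((read = ns.Read(buffer, 0, buffer.Length)) > 0)
                        fs.Write(buffer, 0, read);
                }
                return; // download completed
            }
            catch (WebException)
            {
                // Connection dropped; loop around and resume from the file's new length.
            }
            catch (IOException)
            {
                // Transient read/write failure; retry as well.
            }
        }
        throw new WebException("Download failed after " + maxRetries + " attempts.");
    }
}
```

Note that if the server doesn't support range requests, the `FileMode.Create` branch makes each retry a full re-download, which matches the caveat in the question: resuming only avoids re-downloading when the server cooperates.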

In fact, this code is part of a bigger open-source download manager that I'm trying to implement in C#. You may refer to this application and see if anything in it is helpful to you: http://scavenger.codeplex.com/

Prahlad Yeri answered Nov 16 '22


There is an open-source .NET HTTP file downloader with automatic resume/retry support (when the server supports it). It looks like exactly what you need, so you could try using it:

https://github.com/Avira/.NetFileDownloader

leonid p answered Nov 16 '22