This is a (basic) example of what I currently have:
foreach (var uri in uris)
{
    using (var client = new WebClient())
    {
        client.Proxy = null;
        client.DownloadStringCompleted += DownloadComplete;
        client.DownloadStringAsync(uri);
    }
}
Is there a faster way?
The important thing is to perform the downloads in parallel, which your code already does thanks to the asynchronous download. Beyond that, the download speed is limited by the actual network transfer rate, so it is about as fast as it gets.
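As a side note, on .NET 4.5 or later the same start-them-all-then-wait pattern can be written with HttpClient and Task.WhenAll; this is only a sketch, and the URI list here is illustrative:

```csharp
using System;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

class ParallelDownloads
{
    static async Task Main()
    {
        // Illustrative URIs; substitute your own list.
        var uris = new[]
        {
            new Uri("http://example.com/a"),
            new Uri("http://example.com/b"),
        };

        // A single HttpClient instance is thread-safe and can serve all requests.
        using (var client = new HttpClient())
        {
            // Start every download first, then await them together,
            // so the transfers overlap instead of running one by one.
            var tasks = uris.Select(u => client.GetStringAsync(u));
            string[] pages = await Task.WhenAll(tasks);
            Console.WriteLine("Downloaded " + pages.Length + " pages");
        }
    }
}
```

Note that `async Task Main` requires C# 7.1 or later; on older compilers, wrap the body in a helper method and call `.GetAwaiter().GetResult()` from `Main`.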
I believe you can make it noticeably faster by setting the Accept-Encoding header to gzip,deflate, provided the server supports gzip (most modern web servers do).
The basic idea is to ask the server to compress the content before sending it; for a typical web page this can cut the transferred size roughly in half, saving a corresponding amount of download time.
Look at this: http://csharpfeeds.com/post/5518/HttpWebRequest_and_GZip_Http_Responses.aspx
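Since you are already using WebClient, one way to get this behavior is a small subclass; setting AutomaticDecompression on the underlying HttpWebRequest makes it send the Accept-Encoding header and transparently decompress the response. A minimal sketch (the class name is my own):

```csharp
using System;
using System.Net;

// Sketch: a WebClient that asks the server for gzip/deflate-compressed
// responses and decompresses them transparently.
class GzipWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        var request = base.GetWebRequest(address);
        var http = request as HttpWebRequest;
        if (http != null)
        {
            // Sends "Accept-Encoding: gzip, deflate" and decompresses
            // the response body automatically.
            http.AutomaticDecompression =
                DecompressionMethods.GZip | DecompressionMethods.Deflate;
        }
        return request;
    }
}
```

You can then use GzipWebClient exactly where your loop currently creates a WebClient, with no other changes to the download code.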