Problem: I need to download hundreds of images from different hosts. Each host has anywhere from 20 to several hundred images.
Proposed solution: create a new WebClient every time an image needs to be downloaded, and fetch it with the WebClient's DownloadData method.
Or would it be better to keep a pool of open socket connections and make the HTTP requests using lower-level calls?
Is it expensive to open/close a TCP connection (I'm assuming that is what WebClient does on each call), so that using a pool would be more efficient?
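Roughly, the simple version I have in mind looks like this minimal sketch (the URLs are just placeholders, and error handling is omitted):

```csharp
using System;
using System.IO;
using System.Net;

class ImageDownloader
{
    static void Main()
    {
        // Placeholder URLs; in reality these come from hundreds of different hosts.
        string[] imageUrls =
        {
            "http://example.com/images/1.jpg",
            "http://example.org/images/2.jpg"
        };

        foreach (string url in imageUrls)
        {
            // A new WebClient for every single image, as described above.
            using (var client = new WebClient())
            {
                byte[] data = client.DownloadData(url);
                File.WriteAllBytes(Path.GetFileName(new Uri(url).LocalPath), data);
            }
        }
    }
}
```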
I believe the underlying infrastructure which WebClient uses will already pool HTTP connections, so there's no need to do this yourself. You may want to verify that with something like Wireshark, of course, using some sample URLs.
Fundamentally, I'd take the same approach to this as with other programming tasks: write the code in the simplest way that works, and then check whether it performs well enough for your needs. If it does, you're done. If it doesn't, use appropriate tools (network analyzers, etc.) to work out why it's not performing well enough, and move to more complicated code only if it actually fixes the problem.
My experience is that WebClient is fine if it does what you need - but it doesn't give you quite as much fine-grained control as WebRequest. If you don't need that control, go with WebClient.
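If you do end up wanting that control, here is a rough sketch of what the WebRequest route might look like, assuming .NET 4+ for Stream.CopyTo; the timeout, user agent, and URL handling are placeholder choices, not requirements:

```csharp
using System.IO;
using System.Net;

class WebRequestDownload
{
    static byte[] DownloadImage(string url)
    {
        // HttpWebRequest exposes knobs that WebClient hides, e.g. timeouts and keep-alive.
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Timeout = 10000;                 // milliseconds (placeholder value)
        request.KeepAlive = true;                // reuse the pooled connection to this host
        request.UserAgent = "ImageFetcher/1.0";  // hypothetical user agent string

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var stream = response.GetResponseStream())
        using (var buffer = new MemoryStream())
        {
            stream.CopyTo(buffer);
            return buffer.ToArray();
        }
    }
}
```

As a side note on the pooling point above: .NET manages reusable connections per host via ServicePoint objects, and if you find you need more simultaneous connections to a single host, ServicePointManager.DefaultConnectionLimit is the usual setting to look at.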