Download 3000+ Images Using C#?

I have a list of around 3000 image URLs that I need to download to my desktop.

I'm a web dev, so naturally I wrote a little ASP.NET C# download method to do this, but the obvious problem happened: the page timed out before I'd downloaded more than a handful of them.

Does anyone know of a good, quick and robust way of looping through all the image URLs and downloading them to a folder? I'm open to any suggestions: WinForms, a batch file, whatever, although I'm a novice at both.

Any help greatly appreciated.

asked Dec 14 '10 by YodasMyDad

5 Answers

What about wget? It can download a list of URLs specified in a file.

wget -i c:\list-of-urls.txt
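
If you need a bit more control, wget also accepts a destination directory and a retry count; something like the following should do it (the paths and flag values here are just examples):

wget -i c:\list-of-urls.txt -P c:\images --tries=3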
answered by Jorge Ferreira

Write a C# command-line application (or WinForms, if that's your inclination), and use the WebClient class to retrieve the files.

Here are some tutorials:

C# WebClient Tutorial

Using WebClient to Download a File

or, just Google C# WebClient.

You'll either need to provide a list of files to download and loop through it, issuing a request for each file and saving the result. Or you can issue a request for the index page, parse it using something like the HTML Agility Pack to find all of the image tags, and then request each image and save the result somewhere on your local drive.
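
If you go the index-page route, here's a minimal sketch using the HTML Agility Pack to pull out the image URLs (the gallery URL is hypothetical, and the package needs to be installed):

using System;
using HtmlAgilityPack;

// Load the index page (hypothetical URL; substitute your own)
HtmlWeb web = new HtmlWeb();
HtmlDocument doc = web.Load("http://example.com/gallery");

// SelectNodes returns null when nothing matches, so guard against that
var imgNodes = doc.DocumentNode.SelectNodes("//img[@src]");
if (imgNodes != null)
{
    foreach (HtmlNode img in imgNodes)
    {
        Console.WriteLine(img.GetAttributeValue("src", ""));
    }
}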

Edit

If you just want to do this once (as in, not as part of an application), mbeckish's answer makes the most sense.

answered by 3Dave


You might want to use an existing download manager like Orbit rather than writing your own program for the purpose (blasphemy, I know).

I've been pretty happy with Orbit. It lets you import a list of downloads from a text file, it manages the connections, downloading portions of each file in parallel over multiple connections to speed each download up, and it takes care of retrying when connections time out, and so on. It seems like you'd have to go to a lot of effort to build that kind of feature set from scratch.

answered by StriplingWarrior


If this is just a one-time job, then one easy solution would be to write an HTML page with img tags pointing to the URLs.

Then browse it with Firefox and use an extension to save all of the images to a folder.
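
For what it's worth, here's a minimal C# sketch that generates such a page, assuming your URLs sit one per line in a text file; urls.txt and gallery.html are placeholder names:

using System.IO;
using System.Text;

// urls.txt is assumed to contain one image URL per line
StringBuilder html = new StringBuilder("<html><body>\n");
foreach (string url in File.ReadAllLines("urls.txt"))
{
    html.AppendFormat("<img src=\"{0}\" />\n", url);
}
html.Append("</body></html>");
File.WriteAllText("gallery.html", html.ToString());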

answered by mbeckish


Working on the assumption that this is a one-off, run-once project, and as you are a novice with the other technologies, I would suggest the following:

Rather than trying to download all 3000 images in one web request, do one image per request. When an image download completes, redirect to the same page, passing the URL of the next image to get as a query string parameter. Download that one, then repeat until all the images are downloaded.

Not what I would call a "production" solution, but if my assumption is correct it will have you up and running in no time.
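
A rough sketch of that approach as an ASP.NET Web Forms code-behind (it passes an index rather than the full URL, which amounts to the same thing; Download.aspx is a hypothetical page name, GetImageUrls() stands in for wherever your list comes from, and the .jpg extension is just a guess):

using System;
using System.Collections.Generic;
using System.Net;

public partial class Download : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        List<string> urls = GetImageUrls(); // stand-in: load your 3000 URLs here

        // Which image are we on? Defaults to the first one.
        int i;
        if (!int.TryParse(Request.QueryString["i"], out i))
            i = 0;

        if (i < urls.Count)
        {
            using (WebClient client = new WebClient())
            {
                // One download per request keeps each request short enough to avoid the timeout
                client.DownloadFile(urls[i], Server.MapPath("~/images/" + i + ".jpg"));
            }

            // Redirect to ourselves to fetch the next image
            Response.Redirect("Download.aspx?i=" + (i + 1));
        }
    }

    private List<string> GetImageUrls()
    {
        // placeholder: read the URL list from a file, database, etc.
        return new List<string>();
    }
}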

Another fairly simple solution would be to create a simple C# console application that uses WebClient to download each of the images. The following sketch should give you enough to get going (it reuses one WebClient and names each file after the last segment of its URL):

using System;
using System.Collections.Generic;
using System.IO;
using System.Net;

List<string> imageUrls = new List<string>();
// ..... your urls from wherever .....

using (WebClient client = new WebClient())
{
    foreach (string imageUrl in imageUrls)
    {
        // Name the local file after the last segment of the URL
        string fileName = Path.GetFileName(new Uri(imageUrl).LocalPath);

        byte[] raw = client.DownloadData(imageUrl);
        File.WriteAllBytes(fileName, raw);
    }
}
answered by MrEyes