
Download all images from a single directory of a website

I need to get all of the images from one website that are contained in a single folder, for instance site.com/images/. Is this possible? If so, what's the best way?

asked Jan 07 '11 by bryan sammon



2 Answers

Do you have FTP access?

Do you have shell access?

With Linux it's pretty easy. Not sure about Windows.

# -r --level=1 recurses one level, -H spans hosts, -k converts links for local viewing, -p fetches page requisites such as images
wget -H -r --level=1 -k -p http://example.com/path/to/images

Edit: Just found wget for Windows.

Edit 2: I just saw the PHP tag. To create a PHP script that downloads all the images in one go, you will have to create a zip (or equivalent) archive and send it with the correct headers. Here is how to zip a folder in PHP; it wouldn't be hard to restrict it to only the images in that folder. Just edit the code given to something like:

foreach ($iterator as $key => $value) {
    if (!is_dir($key)) {
        // pathinfo() is safer than explode('.', ...) here:
        // it returns the real extension even if the path
        // contains more than one dot
        $ext = strtolower(pathinfo($key, PATHINFO_EXTENSION));
        switch ($ext) {
            case "png":
            case "gif":
            case "jpg":
                $zip->addFile(realpath($key), $key) or die("ERROR: Could not add file: $key");
                break;
        }
    }
}
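The answer mentions sending the archive "with the correct headers" but doesn't show that step. A minimal self-contained sketch, assuming the images sit in a local images/ folder (the folder name, temp path, and download filename are all placeholder assumptions):

```php
<?php
// Build a zip of the image files (glob() with GLOB_BRACE matches
// several extensions in one pattern).
$zipPath = sys_get_temp_dir() . '/images.zip';
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
foreach (glob('images/*.{png,gif,jpg,jpeg}', GLOB_BRACE) as $file) {
    $zip->addFile($file, basename($file));
}
$zip->close();

// The "correct headers" step: tell the browser this is a zip
// download, then stream the archive.
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="images.zip"');
header('Content-Length: ' . filesize($zipPath));
readfile($zipPath);
```

Content-Disposition: attachment is what makes the browser save the response as a file instead of trying to render it.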
answered Oct 09 '22 by rich97


Have a look at the HTTrack software. It can download whole sites. Give it the address site.com/images/ and it will download everything in that directory (provided the owner has not restricted access to it).

answered Oct 09 '22 by Tasawer Khan