I need to get all of the images from one website that are contained in a single folder, for instance site.com/images/.*. Is this possible? If so, what's the best way?
Do you have FTP access?
Do you have shell access?
With Linux it's pretty easy; not sure about Windows.
wget -H -r --level=1 -k -p http://example.com/path/to/images
Edit: Just found wget for Windows.
Edit 2: I just saw the PHP tag. To create a PHP script that downloads all the images in one go, you will have to create a zip (or equivalent) archive and send it with the correct headers. Here is how to zip a folder in PHP; it wouldn't be hard to restrict it to only the images in that folder. Just edit the given code to something like:
foreach ($iterator as $key => $value) {
    if (!is_dir($key)) {
        // Check the extension of the file itself, not the whole path,
        // so dots elsewhere in the path don't break the match.
        $ext = strtolower(pathinfo($key, PATHINFO_EXTENSION));
        switch ($ext) {
            case "png":
            case "gif":
            case "jpg":
                $zip->addFile(realpath($key), $key) or die("ERROR: Could not add file: $key");
                break;
        }
    }
}
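Putting the pieces together, a minimal self-contained sketch of such a download script could look like the following. The directory path, archive name, and accepted extensions are assumptions for illustration only, not anything from the linked article:

<?php
// Sketch only: zip every image in a local "images" folder and stream it to the browser.
$dir     = __DIR__ . '/images';                 // assumed location of the images
$zipPath = sys_get_temp_dir() . '/images.zip';  // assumed temporary archive path

$zip = new ZipArchive();
if ($zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
    die("ERROR: Could not create archive");
}

$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($dir, FilesystemIterator::SKIP_DOTS)
);
foreach ($iterator as $key => $value) {
    // Only add common image types to the archive.
    $ext = strtolower(pathinfo($key, PATHINFO_EXTENSION));
    if (in_array($ext, array('png', 'gif', 'jpg'), true)) {
        $zip->addFile(realpath($key), basename($key))
            or die("ERROR: Could not add file: $key");
    }
}
$zip->close();

// Send the archive with the headers a browser expects for a file download.
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="images.zip"');
header('Content-Length: ' . filesize($zipPath));
readfile($zipPath);
exit;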
Have a look at the HTTrack software. It can download whole sites. Give it the website address site.com/images/
and it will download everything in that directory (provided the owner hasn't restricted access to it).
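For example, with the command-line version an invocation roughly like the one below should mirror just that folder. The output directory and filter pattern are illustrative; check the HTTrack documentation for the exact filter syntax on your version:

httrack "http://site.com/images/" -O ./site-images "+site.com/images/*" -v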