Can anyone help me with this?
I want to download all the files from a folder inside my S3 bucket to a directory of the same name on my computer.
Let's say there is a bucket named "ABC" containing a folder "DEF", and that folder holds multiple files.
I want to download the "DEF" folder into my project directory "/opt/lampp/htdocs/porject/files/download/", where a "DEF" folder also exists.
Could anyone share the code for this?
Thanks in advance.
=============
ERROR:
Fatal error: Uncaught exception 'UnexpectedValueException' with message 'RecursiveDirectoryIterator::__construct() [recursivedirectoryiterator.__construct]: Unable to find the wrapper "s3" - did you forget to enable it when you configured PHP?' in /opt/lampp/htdocs/demo/amazon-s3/test.php:21 Stack trace: #0 /opt/lampp/htdocs/demo/amazon-s3/test.php(21): RecursiveDirectoryIterator->__construct('s3://bucketname/folder...') #1 {main} thrown in /opt/lampp/htdocs/demo/amazon-s3/test.php on line 21
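That particular error means the `s3://` stream wrapper was never registered with PHP. In the AWS SDK for PHP v2 the S3 client can register it for you via `registerStreamWrapper()`. Here is a minimal sketch (the bucket/folder names come from the question; the credential values are placeholders you must replace):

```php
<?php
// Requires the AWS SDK for PHP v2 (aws/aws-sdk-php via Composer).
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Placeholder credentials: replace with your own configuration.
$client = S3Client::factory(array(
    'key'    => 'YOUR_AWS_ACCESS_KEY_ID',
    'secret' => 'YOUR_AWS_SECRET_ACCESS_KEY',
));

// Registers the "s3" wrapper so s3://bucket/key paths work with
// PHP's filesystem functions and iterators.
$client->registerStreamWrapper();

// Now the iterator from the question can resolve s3:// paths.
$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('s3://ABC/DEF/')
);
foreach ($iterator as $object) {
    echo $object->getPathname(), PHP_EOL;
}
```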
Mark's answer is totally valid, but there is also an even easier way to do this with the AWS SDK for PHP using the downloadBucket() method. Here's an example (assuming $client is an instance of the S3 client):
$bucket = 'YOUR_BUCKET_NAME';
$directory = 'YOUR_FOLDER_OR_KEY_PREFIX_IN_S3';
$basePath = 'YOUR_LOCAL_PATH/';
$client->downloadBucket($basePath . $directory, $bucket, $directory);
The cool thing about this method is that it queues up only the files that don't already exist (or have been modified) in the local directory, and it attempts to download them in parallel to speed up the overall download time. There is a 4th argument to the method (see the link) that accepts other options, like setting how many parallel downloads you want to happen at a time.