I have been trying to do a batch upload using the S3 client's execute() method. It keeps throwing this exception:
Message: Argument 1 passed to Aws\AwsClient::execute() must implement interface Aws\CommandInterface, array given, called in....
This happens even though I followed the example code from the docs. Here is my code; maybe I am doing something wrong:
$bucket = 'myBucket';
$commands = [];

$s3 = new Aws\S3\S3Client([
    'version' => 'latest',
    'region'  => 'us-west-1',
]);

// Build one PutObject command per file.
// Note: 'Body' takes the object contents as a string or stream;
// to upload from a file path, use 'SourceFile' instead.
$commands[] = $s3->getCommand('PutObject', [
    'Bucket' => $bucket,
    'Key'    => 'key1.gif',
    'Body'   => 'PATH_TO_FILE_1',
]);
$commands[] = $s3->getCommand('PutObject', [
    'Bucket' => $bucket,
    'Key'    => 'key2.gif',
    'Body'   => 'PATH_TO_FILE_2',
]);

// This is the line that throws: execute() expects a single CommandInterface, not an array.
$s3->execute($commands);
Thanks in advance!
Just solved this myself.
Instead of calling $s3->execute(), pass the client and the $commands array to Aws\CommandPool like so:
use Aws\CommandPool;

// ... build $commands as above ...

// Executes the queued commands concurrently and waits for all of them to complete.
$results = CommandPool::batch($s3, $commands);
You'll receive an array of results in the same order as the commands. One difference from execute() is that the array will also contain Exception objects for any commands that failed, instead of the exceptions being thrown.
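One detail not shown above: batch() also accepts an optional config array, so something like CommandPool::batch($s3, $commands, ['concurrency' => 5]) caps how many requests run in parallel (the 5 is arbitrary). And here is a minimal sketch of sorting the results into successes and failures, assuming the $results array from the snippet above:

foreach ($results as $i => $result) {
    if ($result instanceof \Exception) {
        // Failed commands come back as exceptions in the results array rather than being thrown.
        echo "Upload {$i} failed: " . $result->getMessage() . PHP_EOL;
    } else {
        // $result is an Aws\ResultInterface; PutObject responses include an ETag.
        echo "Upload {$i} succeeded, ETag: " . $result['ETag'] . PHP_EOL;
    }
}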