
Uploading multiple files to Amazon S3 from PHP

Is there a way to upload multiple files in one go, rather than having to reconnect for each one?

I am using S3 as storage for my PHP application, which needs to store large numbers of mostly small (about 10 KB) image files, around 100 at a time. Currently I am looping through them and uploading each one individually with this code:

$s3->putObjectFile($uploadFile, $bucketName, ($uploadFile), S3::ACL_PUBLIC_READ)

This takes a LONG time: about a minute for 1.5 MB of files. Turning off SSL, as suggested in other answers, reduces this to about 40 seconds, but that's still very slow.

Here is my current code, using the Amazon S3 REST implementation for PHP:

$s3 = new S3($awsAccessKey, $awsSecretKey, false);

function send_to_s3($s3, $bucketName, $uploadFile)
{
    $start = microtime(true);

    // Check that the upload file exists
    if (!file_exists($uploadFile) || !is_file($uploadFile))
        exit("\nERROR: No such file: $uploadFile\n\n");

    // Check for the cURL extension
    if (!extension_loaded('curl') && !@dl(PHP_SHLIB_SUFFIX == 'so' ? 'curl.so' : 'php_curl.dll'))
        exit("\nERROR: CURL extension not loaded\n\n");

    if ($s3->putObjectFile($uploadFile, $bucketName, $uploadFile, S3::ACL_PUBLIC_READ)) {
        $took = microtime(true) - $start;
        echo "S3::putObjectFile(): File copied to {$bucketName}/{$uploadFile}" . PHP_EOL . ' - ' . filesize($uploadFile) . ' in ' . $took . ' seconds<br />';
        return $took;
    } else {
        print 'error';
    }
}

I'd appreciate any help.

Asked Jan 11 '15 by Jonathan Plackett
1 Answer

With version 3 of the AWS SDK for PHP, you can queue up PutObject commands and send them concurrently with CommandPool::batch():

use Aws\S3\S3Client;
use Aws\CommandPool;
use Aws\Exception\AwsException;

// $clientS3 is an Aws\S3\S3Client instance created elsewhere.
$commands = array();
foreach ($objects as $key => $file) {
    $commands[] = $clientS3->getCommand('PutObject', array(
        'ACL'    => 'bucket-owner-full-control',
        'Bucket' => 'bucket_name',
        'Key'    => $key,              // each object needs its own unique key
        'Body'   => $file['body'],
    ));
}

// CommandPool::batch() executes the commands concurrently; in SDK v3,
// failed commands come back in the results array as exception objects
// rather than via a thrown Guzzle CommandTransferException (that was SDK v2).
$results = CommandPool::batch($clientS3, $commands);

foreach ($results as $result) {
    if ($result instanceof AwsException) {
        echo "Failed: " . $result->getMessage() . "\n";
    }
}
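A related sketch, assuming the same SDK v3 S3Client (the region, bucket name, and glob pattern below are placeholders, not from the question): each putObjectAsync() call returns a promise, so you can fire all the uploads and wait on them together, letting the SDK multiplex the requests instead of uploading one file at a time.

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use GuzzleHttp\Promise\Utils;

// Placeholder region and bucket - adjust for your setup.
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

$promises = [];
foreach (glob('/path/to/images/*.jpg') as $path) {
    // putObjectAsync() returns a promise immediately; nothing blocks yet.
    $promises[] = $s3->putObjectAsync([
        'Bucket' => 'bucket_name',
        'Key'    => basename($path),
        'Body'   => fopen($path, 'rb'),
        'ACL'    => 'public-read',
    ]);
}

// Wait for every upload to settle; this throws if any upload fails.
Utils::all($promises)->wait();
```

This avoids the per-file connection overhead the question describes, since the SDK's HTTP handler reuses connections across concurrent requests.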
Answered Oct 11 '22 by harsh tibrewal
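Since the question is really about moving a whole batch of small files, it's also worth knowing that SDK v3 ships a high-level Aws\S3\Transfer helper that does the concurrent batching for you. A minimal sketch (the region, directory path, bucket, and concurrency value here are illustrative assumptions):

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Transfer;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',   // placeholder region
]);

// Upload every file under the local directory to s3://bucket_name/images,
// keeping up to 20 requests in flight at once.
$transfer = new Transfer($s3, '/path/to/images', 's3://bucket_name/images', [
    'concurrency' => 20,
]);
$transfer->transfer();
```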