How to unlock the file after AWS S3 Helper uploading file?

I am using the official PHP SDK with the official service provider for Laravel to upload an image to Amazon S3. The image is stored temporarily on my server and should be deleted after uploading. The following code does the upload and delete:

$temp_path = "/screenshot_temp/testing.png";

$client = AWS::createClient('s3');
$result = $client->putObject(array(
    'Bucket'     => self::$bucketName,
    'Key'        => 'screenshot/testing.png',
    'SourceFile' => $temp_path,
    'ACL'        => 'public-read'
));

chmod($temp_path, 0777); // chmod, not chown: 0777 is a permission mode
unlink($temp_path);

The upload is successful. I can see my image via the returned link, and I can see it in the Amazon console. The problem is that the delete fails with the following error message:

ErrorException: unlink(... path of my file ...): Permission denied

I am sure my file permissions are correct, and I can delete the file when the S3 upload code is commented out. So the file must be locked while it is being uploaded. Is there a way to unlock and delete the file?

asked Dec 30 '16 by cytsunny



2 Answers

Yes, the streaming upload locks the file until it finishes. Try either of these two approaches:

$client = AWS::createClient('s3');
$fileContent = file_get_contents($temp_path);
$result = $client->putObject(array(
    'Bucket'     => self::$bucketName,
    'Key'        => 'screenshot/testing.png',
    'Body'       => $fileContent,
    'ACL'        => 'public-read'
));

unlink($temp_path);

or

$client = AWS::createClient('s3');
$fileContent = file_get_contents($temp_path);
$result = $client->putObject(array(
    'Bucket'     => self::$bucketName,
    'Key'        => 'screenshot/testing.png',
    'Body'       => $fileContent,
    'ACL'        => 'public-read'
));

gc_collect_cycles();
unlink($temp_path);
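The two snippets above differ only in the `gc_collect_cycles()` call, which forces PHP's garbage collector to run and release any stream the SDK may still be holding. A minimal sketch of that read-then-delete pattern, with the S3 call replaced by a hypothetical local stub (`uploadStub` is not part of the SDK) so it can run without AWS credentials:

```php
<?php
// Hypothetical stand-in for $client->putObject(); the real S3 call is unchanged.
function uploadStub(string $body): void
{
    // ... the SDK would send $body to S3 here ...
}

$temp_path = sys_get_temp_dir() . '/testing.png';
file_put_contents($temp_path, 'fake image bytes');

// Read the whole file into memory so no stream stays open on it.
$fileContent = file_get_contents($temp_path);
uploadStub($fileContent);

// Drop lingering references and run the collector before deleting.
unset($fileContent);
gc_collect_cycles();

unlink($temp_path);
var_dump(file_exists($temp_path)); // bool(false)
```

Because `file_get_contents()` opens and closes the file in one call, nothing in your own code keeps a handle on it by the time `unlink()` runs.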
answered Oct 03 '22 by Vineesh


When you use the SourceFile option with putObject, the S3Client opens the file but doesn't close it after the operation finishes.

In most cases you can just unset $client and/or $result to close the opened files. Unfortunately, that doesn't work in this case.
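As a quick local check of why unsetting usually works (no SDK involved): PHP uses reference counting, so a stream is closed as soon as the last variable referring to it goes away.

```php
<?php
$path = tempnam(sys_get_temp_dir(), 'lk');
$h = fopen($path, 'r');

$open_before = count(get_resources('stream'));
unset($h);                           // last reference gone; PHP closes the stream
$open_after  = count(get_resources('stream'));

var_dump($open_after < $open_before); // bool(true)

unlink($path);                        // nothing holds the file open any more
```

The problem here is that the SDK keeps its own internal reference to the stream it opened for SourceFile, so unsetting your variables doesn't bring the reference count to zero.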

Use the Body option instead of SourceFile.

// temp file
$file = fopen($temp_path, "r");

// use resource, not a path
$result = $client->putObject(array(
        'Bucket'     => self::$bucketName,
        'Key'        => 'screenshot/testing.png',
        'Body'       => $file,
        'ACL'        => 'public-read'
    ));

fclose($file);

unlink($temp_path);
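The key difference from the question's version is that this code owns the file handle, so it can close it explicitly before `unlink()`. A self-contained sketch of that open → upload → close → delete ordering, with the S3 call replaced by a hypothetical stub (`putObjectStub` is illustration only, not an SDK function):

```php
<?php
// Hypothetical stub for the S3 upload; like the SDK, it consumes a stream.
function putObjectStub($stream): void
{
    stream_get_contents($stream); // the SDK would stream this to S3
}

$temp_path = sys_get_temp_dir() . '/testing_body.png';
file_put_contents($temp_path, 'fake image bytes');

$file = fopen($temp_path, 'r');  // we own this handle...
putObjectStub($file);
fclose($file);                   // ...so we can close it before deleting

unlink($temp_path);              // no open handle left; delete succeeds
var_dump(file_exists($temp_path)); // bool(false)
```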
answered Oct 03 '22 by Leonid Shumakov