After I copied an object to the same bucket with a different key and prefix (similar to renaming, I believe), its public-read permission was removed.
import boto3

s3 = boto3.resource('s3')
copy_source = {
    'Bucket': 'mybucket',
    'Key': 'mykey'
}
s3.meta.client.copy(copy_source, 'otherbucket', 'otherkey')
When I tried to open the file in a browser using the link shown in S3, I got the error below. On the Permissions tab, the copy does not have the public-read permission that the original file has.
<Error>
<Code>AccessDenied</Code>
<Message>Access Denied</Message>
<RequestId>***</RequestId>
<HostId>***</HostId>
</Error>
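You can also confirm this from code rather than the console. A minimal sketch, assuming the destination bucket and key from the snippet above, that prints the grants on the copied object:

import boto3

s3 = boto3.resource('s3')
# Fetch the ACL of the copied object; a freshly copied object carries only
# the owner's FULL_CONTROL grant, not the source object's public-read grant.
acl = s3.meta.client.get_object_acl(Bucket='otherbucket', Key='otherkey')
for grant in acl['Grants']:
    print(grant['Grantee'], grant['Permission'])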
Two questions:
i) Is it possible to maintain the ACL permissions when I use
copy(CopySource, Bucket, Key, ExtraArgs=None, Callback=None, SourceClient=None, Config=None)
ii) What ExtraArgs can I set? The documentation doesn't mention any.
This is not exactly the answer I wanted, but it seems to work for now.
I am not sure how to carry over the original permission, but I can manually set the ACL to public-read or whichever canned ACL I need.
These are the possible ACL values from boto3's copy_object:
'private'|'public-read'|'public-read-write'|'authenticated-read'|'aws-exec-read'|'bucket-owner-read'|'bucket-owner-full-control'
import boto3

s3 = boto3.resource('s3')
copy_source = {
    'Bucket': 'mybucket',
    'Key': 'mykey'
}
# The fourth positional argument of copy() is ExtraArgs.
extra_args = {
    'ACL': 'public-read'
}
s3.meta.client.copy(copy_source, 'otherbucket', 'otherkey', extra_args)
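Regarding question ii: the keys that copy() accepts in ExtraArgs are the same whitelist used for uploads. As a sketch, you can print them from boto3 itself (the attribute lives in boto3's transfer module):

import boto3.s3.transfer

# ExtraArgs for copy()/upload_file() must come from this whitelist,
# which includes 'ACL', 'Metadata', 'ContentType', and others.
print(boto3.s3.transfer.S3Transfer.ALLOWED_UPLOAD_ARGS)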
Then I realized that the copy with the ACL needs more permissions. I was not sure at first which permissions it actually needed, because I got these errors:
i) An error occurred (AccessDenied) when calling the CopyObject operation: Access Denied.
ii) An error occurred (AccessDenied) when calling the CreateMultipartUpload operation: Access Denied.
It works after I added the "s3:PutObjectAcl" action to my IAM policy.
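I am not reproducing my exact policy here, but a minimal sketch along these lines should cover both errors, expressed as a Python dict (the bucket names are the placeholders from the snippets above; s3:PutObject is what the CreateMultipartUpload path checks when copy() handles large objects):

import json

# Hypothetical minimal policy: read from the source bucket, write the new
# object and its ACL to the destination bucket. Adjust resources as needed.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::mybucket/*",
        },
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:PutObjectAcl"],
            "Resource": "arn:aws:s3:::otherbucket/*",
        },
    ],
}
print(json.dumps(policy, indent=2))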