I'm writing a django backend for an application in which the client will upload a video file to s3. I want to use presigned urls, so the django server will sign a url and pass it back to the client, who will then upload their video to s3. The problem is, the generate_presigned_url method does not seem to know about the s3 client upload_file method...
Following this example, I use the following code to generate the url for upload:
s3_client = boto3.client('s3')
try:
    s3_object_name = str(uuid4()) + file_extension
    params = {
        "file_name": local_filename,
        "bucket": settings.VIDEO_UPLOAD_BUCKET_NAME,
        "object_name": s3_object_name,
    }
    response = s3_client.generate_presigned_url(ClientMethod="upload_file",
                                                Params=params,
                                                ExpiresIn=500)
except ClientError as e:
    logging.error(e)
    return HttpResponse(503, reason="Could not retrieve upload url.")
When running it I get the error:
File "/Users/bridgedudley/.local/share/virtualenvs/ShoMe/lib/python3.6/site-packages/botocore/signers.py", line 574, in generate_presigned_url operation_name = self._PY_TO_OP_NAME[client_method] KeyError: 'upload_file'
which triggers the exception:
botocore.exceptions.UnknownClientMethodError: Client does not have method: upload_file
After debugging, I found that the self._PY_TO_OP_NAME dictionary only contains a subset of the S3 client methods offered here:
scrolling down to "upload"...
No upload_file method! I tried the same code using "list_buckets" and it worked perfectly, giving me a presigned url that listed the buckets under the signer's credentials.
So without the upload_file method available in the generate_presigned_url function, how can I achieve my desired functionality?
Thanks!
In addition to the already mentioned usage of:
boto3.client('s3').generate_presigned_url('put_object', Params={'Bucket':'your-bucket-name', 'Key':'your-object-name'})
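A minimal sketch of that put_object approach, for reference; the bucket name, key, file name, and the use of the requests library on the client side are placeholders rather than anything from the question:
import boto3
import requests

s3_client = boto3.client('s3')

# Server side: sign a URL that allows a single PUT of this object.
presigned_url = s3_client.generate_presigned_url(
    ClientMethod='put_object',
    Params={'Bucket': 'your-bucket-name', 'Key': 'your-object-name'},
    ExpiresIn=500,
)

# Client side: PUT the file body directly to the signed URL.
with open('./some-file.txt', 'rb') as f:
    response = requests.put(presigned_url, data=f)
print(response.status_code)  # 200 on success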
You can also use:
boto3.client('s3').generate_presigned_post('your-bucket_name', 'your-object_name')
Reference: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-presigned-urls.html#generating-a-presigned-url-to-upload-a-file
import boto3
bucket_name = 'my-bucket'
key_name = 'any-name.txt'
s3_client = boto3.client('s3')
upload_details = s3_client.generate_presigned_post(bucket_name, key_name)
print(upload_details)
Output:
{'url': 'https://my-bucket.s3.amazonaws.com/', 'fields': {'key': 'any-name.txt', 'AWSAccessKeyId': 'QWERTYUOP123', 'x-amz-security-token': 'a1s2d3f4g5h6j7k8l9', 'policy': 'z0x9c8v7b6n5m4', 'signature': 'qaz123wsx456edc'}}
import requests
filename_to_upload = './some-file.txt'
with open(filename_to_upload, 'rb') as file_to_upload:
    files = {'file': (filename_to_upload, file_to_upload)}
    upload_response = requests.post(upload_details['url'], data=upload_details['fields'], files=files)
print(f"Upload response: {upload_response.status_code}")
Output:
Upload response: 204
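If the upload needs to be constrained (expiry time, content type, size), generate_presigned_post also accepts Fields and Conditions. A hedged sketch, where the content type and size limit below are example values, not anything required by the question:
upload_details = s3_client.generate_presigned_post(
    bucket_name,
    key_name,
    Fields={'Content-Type': 'video/mp4'},
    Conditions=[
        {'Content-Type': 'video/mp4'},                   # client must send this content type
        ['content-length-range', 1, 50 * 1024 * 1024],   # between 1 byte and 50 MB
    ],
    ExpiresIn=500,  # URL validity in seconds (default is 3600)
)
Note that any entry passed in Fields must also be repeated in Conditions, and the client must include these fields in its POST form data.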
As documented:
The credentials used by the presigned URL are those of the AWS user who generated the URL.
Thus, make sure that the entity that generates the presigned URL is allowed the s3:PutObject action, so that the signed URL can actually be used to upload a file to S3. The credentials of that entity can be configured in different ways. Some of them are:
As an allowed policy for a Lambda function
Or through boto3:
s3_client = boto3.client('s3',
    aws_access_key_id="your-access-key-id",
    aws_secret_access_key="your-secret-access-key",
    aws_session_token="your-session-token",  # Only for credentials that have it
)
Or on the working environment:
# Run in the Linux environment
export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"
export AWS_SESSION_TOKEN="your-session-token"  # Only for credentials that have it
Or through libraries e.g. django-storages for Django
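Putting this together in the asker's Django context, here is a hedged sketch of a view that signs and returns the upload details; the view name, the ?ext= query parameter, the JsonResponse shape, and the reuse of settings.VIDEO_UPLOAD_BUCKET_NAME are assumptions, not part of the original question or answer:
import logging
from uuid import uuid4

import boto3
from botocore.exceptions import ClientError
from django.conf import settings
from django.http import HttpResponse, JsonResponse


def create_upload_url(request):
    # Assumes the client passes the desired file extension, e.g. ?ext=.mp4
    file_extension = request.GET.get('ext', '')
    s3_client = boto3.client('s3')
    try:
        s3_object_name = str(uuid4()) + file_extension
        upload_details = s3_client.generate_presigned_post(
            settings.VIDEO_UPLOAD_BUCKET_NAME,
            s3_object_name,
            ExpiresIn=500,
        )
    except ClientError as e:
        logging.error(e)
        return HttpResponse(status=503, reason="Could not retrieve upload url.")
    # The client then POSTs its file to upload_details['url'],
    # sending upload_details['fields'] as form data alongside the file.
    return JsonResponse(upload_details)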