I'm trying to upload a file to my S3 bucket through Chalice (I'm playing around with it currently, still new to this). However, I can't seem to get it right.
I have AWS set up correctly; the tutorial runs successfully and returns me some messages. Then I try to do some upload/download, and the problem shows up.
s3 = boto3.resource('s3', region_name=<some region name, in this case oregon>)
BUCKET = 'mybucket'
# the file I want to upload is in the same folder as my app.py,
# so I simply get the current folder name
UPLOAD_FOLDER = os.path.abspath('')

@app.route('/upload/{file_name}', methods=['PUT'])
def upload_to_s3(file_name):
    s3.meta.client.upload_file(UPLOAD_FOLDER + file_name, BUCKET, file_name)
    return Response(message='upload successful',
                    status_code=200,
                    headers={'Content-Type': 'text/plain'})
Please don't worry about how I set my file path, unless that's the issue, of course.
I got the error log: No such file or directory: '' — in this case file_name is just mypic.jpg.
I'm wondering why the UPLOAD_FOLDER part is not being picked up. Also, for reference, it seems like using an absolute path will be troublesome with Chalice (while testing, I've seen the code being moved to /var/task/).
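(As a side check on the path handling — a minimal sketch, with the folder and file names as examples: plain string concatenation drops the path separator, while os.path.join inserts it.)

```python
import os

# Hypothetical values for illustration only
UPLOAD_FOLDER = '/var/task'
file_name = 'mypic.jpg'

# Concatenation drops the separator between folder and file:
print(UPLOAD_FOLDER + file_name)               # /var/taskmypic.jpg
# os.path.join adds it where needed:
print(os.path.join(UPLOAD_FOLDER, file_name))  # /var/task/mypic.jpg
```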
Does anyone know how to set it up correctly?
EDIT:
the complete script
from chalice import Chalice, Response
import boto3

app = Chalice(app_name='helloworld')  # I'm just modifying the script I used for the tutorial
s3 = boto3.client('s3', region_name='us-west-2')
BUCKET = 'chalicetest1'

@app.route('/')
def index():
    return {'status_code': 200,
            'message': 'welcome to test API'}

@app.route('/upload/{file_name}', methods=['PUT'],
           content_types=['application/octet-stream'])
def upload_to_s3(file_name):
    try:
        body = app.current_request.raw_body
        temp_file = '/tmp/' + file_name
        with open(temp_file, 'wb') as f:
            f.write(body)
        s3.upload_file(temp_file, BUCKET, file_name)
        return Response(body='upload successful',
                        headers={'Content-Type': 'text/plain'},
                        status_code=200)
    except Exception as e:
        app.log.error('error occurred during upload %s' % e)
        return Response(body='upload failed',
                        headers={'Content-Type': 'text/plain'},
                        status_code=400)
I got it running and this works for me as app.py in an AWS Chalice project:
from chalice import Chalice, Response
import boto3

app = Chalice(app_name='helloworld')
BUCKET = 'mybucket'  # bucket name
s3_client = boto3.client('s3')

@app.route('/upload/{file_name}', methods=['PUT'],
           content_types=['application/octet-stream'])
def upload_to_s3(file_name):
    # get raw body of PUT request
    body = app.current_request.raw_body
    # write body to tmp file
    tmp_file_name = '/tmp/' + file_name
    with open(tmp_file_name, 'wb') as tmp_file:
        tmp_file.write(body)
    # upload tmp file to s3 bucket
    s3_client.upload_file(tmp_file_name, BUCKET, file_name)
    return Response(body='upload successful: {}'.format(file_name),
                    status_code=200,
                    headers={'Content-Type': 'text/plain'})
You can test this with curl and its --upload-file option directly from the command line:
curl -X PUT https://YOUR_API_URL_HERE/upload/mypic.jpg --upload-file mypic.jpg --header "Content-Type:application/octet-stream"
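If you'd rather test from Python than curl, the same PUT can be built with the standard library (a sketch; YOUR_API_URL_HERE is a placeholder, and the file here is a dummy payload created for the demo):

```python
import urllib.request

# Create a small dummy payload so the sketch is self-contained
with open('mypic.jpg', 'wb') as f:
    f.write(b'\xff\xd8\xff')  # a few JPEG-ish bytes

# Build the same PUT request the curl command sends
with open('mypic.jpg', 'rb') as f:
    req = urllib.request.Request(
        'https://YOUR_API_URL_HERE/upload/mypic.jpg',
        data=f.read(),
        method='PUT',
        headers={'Content-Type': 'application/octet-stream'},
    )

# urllib.request.urlopen(req) would actually send it
print(req.method, len(req.data))  # PUT 3
```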
To get this running, you have to manually attach a policy that allows writing to S3 to the role of your Lambda function. This role is auto-generated by Chalice. Attach the policy (e.g. AmazonS3FullAccess) manually in the AWS IAM web interface, next to the existing policy on the role created by your Chalice project.
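Alternatively to clicking around in the IAM console, Chalice lets you ship your own policy: set "autogen_policy": false for the stage in .chalice/config.json and put the statements in .chalice/policy-dev.json. A hedged sketch (the bucket name is an assumption; the logs statement keeps Lambda logging working once autogeneration is off):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::mybucket/*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:*:*"
    }
  ]
}
```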
Things to mention:
- Your code is deployed to the read-only directory /var/task/ of the Lambda function, but you have some writable space at /tmp/, see this answer.
- Set the content type 'application/octet-stream' for the @app.route (and upload the file accordingly via curl).
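The /tmp/ point can be made slightly more portable: instead of hard-coding the path, Python's tempfile module resolves the writable temp directory (on AWS Lambda that is /tmp). A small sketch, not Chalice-specific — the body bytes stand in for app.current_request.raw_body:

```python
import os
import tempfile

body = b'fake upload bytes'  # stand-in for app.current_request.raw_body
file_name = 'mypic.jpg'

# tempfile.gettempdir() resolves to /tmp on AWS Lambda
tmp_path = os.path.join(tempfile.gettempdir(), file_name)
with open(tmp_path, 'wb') as f:
    f.write(body)

print(os.path.exists(tmp_path))  # True
```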