
How can I use AWS Lambda to write a file to S3 (Python)?

I have tried to use a Lambda function to write a file to S3. The test shows "succeeded", but nothing appears in my S3 bucket. What happened? Can anyone give me some advice or a solution? Thanks a lot. Here's my code.

```python
import json
import boto3

def lambda_handler(event, context):
    string = "dfghj"
    file_name = "hello.txt"
    lambda_path = "/tmp/" + file_name
    s3_path = "/100001/20180223/" + file_name

    with open(lambda_path, 'w+') as file:
        file.write(string)
        file.close()

    s3 = boto3.resource('s3')
    s3.meta.client.upload_file(lambda_path, 's3bucket', s3_path)
```
asked Feb 23 '18 by Rick.Wang




2 Answers

I've had success streaming data to S3; it has to be encoded as bytes to do this:

```python
import boto3

def lambda_handler(event, context):
    string = "dfghj"
    encoded_string = string.encode("utf-8")

    bucket_name = "s3bucket"
    file_name = "hello.txt"
    s3_path = "100001/20180223/" + file_name

    s3 = boto3.resource("s3")
    s3.Bucket(bucket_name).put_object(Key=s3_path, Body=encoded_string)
```

If the data is in a file, you can read the file and send it up:

```python
with open(filename) as f:
    string = f.read()

encoded_string = string.encode("utf-8")
```
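The encoding step above is what turns the `str` into bytes before it is used as the object `Body`. A minimal standalone sketch of that coercion as a reusable helper (the `to_body` name is mine, not from the answer):

```python
def to_body(data):
    """Return data as UTF-8 bytes, suitable for an S3 object Body."""
    if isinstance(data, str):
        return data.encode("utf-8")
    return data

print(to_body("dfghj"))   # b'dfghj'
print(to_body(b"raw"))    # b'raw'
```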
answered Oct 11 '22 by Tim B


My answer is very similar to Tim B's, but the most important part is:

1. Go to S3 and create the bucket you want to write to.

2. Follow the steps below, otherwise your Lambda will fail due to permission/access errors. I've copied the linked content here for you too, just in case they change the URL or move it to some other page.

a. Open the roles page in the IAM console.

b. Choose Create role.

c. Create a role with the following properties.

- Trusted entity – AWS Lambda.

- Permissions – AWSLambdaExecute.

- Role name – lambda-s3-role.

The AWSLambdaExecute policy has the permissions that the function needs to manage objects in Amazon S3 and write logs to CloudWatch Logs.
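The console steps above can also be scripted. This is a minimal sketch, assuming you have boto3 and valid AWS credentials; the role name and policy ARN match the steps above, and `create_lambda_s3_role` is a hypothetical helper, not something from the answer:

```python
import json

# Trust policy matching step c: allow the Lambda service to assume the role
TRUST_POLICY = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

def create_lambda_s3_role(role_name="lambda-s3-role"):
    """Create the role and attach AWSLambdaExecute (requires AWS credentials)."""
    import boto3  # imported here so the snippet loads even without boto3 installed
    iam = boto3.client("iam")
    iam.create_role(
        RoleName=role_name,
        AssumeRolePolicyDocument=json.dumps(TRUST_POLICY),
    )
    iam.attach_role_policy(
        RoleName=role_name,
        PolicyArn="arn:aws:iam::aws:policy/AWSLambdaExecute",
    )
```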

3. Copy and paste this into your Lambda Python function:

```python
import json
import boto3

def lambda_handler(event, context):
    some_text = "test"
    # put the bucket name you created in step 1
    bucket_name = "my_buck_name"
    file_name = "my_test_file.csv"
    lambda_path = "/tmp/" + file_name
    s3_path = "output/" + file_name

    # write the file to Lambda's local /tmp storage
    with open(lambda_path, "w") as f:
        f.write(some_text)

    # upload it under the s3_path key (the original passed file_name here,
    # which put the object at the bucket root instead of under output/)
    s3 = boto3.resource("s3")
    s3.meta.client.upload_file(lambda_path, bucket_name, s3_path)

    return {
        'statusCode': 200,
        'body': json.dumps('file is created in: ' + s3_path)
    }
```
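If the function still reports success but the bucket looks empty, a quick check from any machine with credentials is a `head_object` call on the exact key; a 404 means the key (or bucket name) doesn't match what you think it is. A sketch, assuming boto3 and valid credentials; `object_exists` is a hypothetical helper, not part of the answer's code:

```python
def object_exists(bucket, key):
    """Return True if s3://bucket/key exists, False if S3 reports it missing."""
    import boto3  # imported here so the snippet loads even without boto3 installed
    from botocore.exceptions import ClientError
    s3 = boto3.client("s3")
    try:
        s3.head_object(Bucket=bucket, Key=key)
        return True
    except ClientError as err:
        if err.response["Error"]["Code"] in ("404", "NoSuchKey"):
            return False
        raise

# e.g. object_exists("my_buck_name", "output/my_test_file.csv")
```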
answered Oct 11 '22 by grepit