I'm trying to write Python logs directly to S3, without saving them to a local file first. I want the logs to be uploaded to S3 automatically when the program finishes running, using the boto3 put_object method:
import atexit
import logging

import boto3


def write_logs(body, bucket, key):
    s3 = boto3.client("s3")
    s3.put_object(Body=body, Bucket=bucket, Key=key)


log = logging.getLogger("some_log_name")
log.info("Hello S3")

atexit.register(write_logs, body=log, bucket="bucket_name", key="key_name")
quit()
This throws an error when uploading to S3. If I remember correctly, put_object requires the uploaded Body to be bytes-like, and here I'm passing the Logger object itself. I'll update the question with the exact error once I have time to recreate the issue.
Note that if an object with the same key already exists in the bucket, the new object will overwrite it, because Amazon S3 keeps only the last write for a given key (unless bucket versioning is enabled).
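If you want to keep the logs from every run instead, one option is to make the key unique per run, for example by embedding a UTC timestamp. The logs/app- prefix below is just an illustrative assumption, not part of the original code:

```python
from datetime import datetime, timezone

# Hypothetical key scheme: one object per run, e.g. logs/app-20240101T120000Z.log
timestamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
key = "logs/app-{}.log".format(timestamp)
print(key)
```

You would then pass this key to put_object in place of the fixed "key_name".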
You need to add a couple of things here. First, create an io.StringIO object. Then attach a logging StreamHandler that writes to it, and add that handler to your logger. Finally, at exit, call the getvalue() method on the StringIO object and upload the result to S3. Be careful to call getvalue() at exit time, not when you register the atexit hook; otherwise you'll capture (and upload) an empty string.
import atexit
import io
import logging

import boto3


def write_logs(body, bucket, key):
    s3 = boto3.client("s3")
    s3.put_object(Body=body, Bucket=bucket, Key=key)


log = logging.getLogger("some_log_name")
log.setLevel(logging.INFO)  # default effective level is WARNING, so INFO records would otherwise be dropped

log_stringio = io.StringIO()
handler = logging.StreamHandler(log_stringio)
log.addHandler(handler)

# Defer getvalue() until exit; calling it at registration time would capture an empty buffer
atexit.register(lambda: write_logs(body=log_stringio.getvalue(),
                                   bucket="bucket_name", key="key_name"))

log.info("Hello S3")
quit()
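As a side note, the StreamHandler above writes bare messages. If you want timestamps and levels in the uploaded log, you can attach a standard logging.Formatter to the handler. The logger name and format string below are just one common choice, not anything required by S3:

```python
import io
import logging

# Hypothetical logger name, separate from the one above to avoid doubled handlers
log = logging.getLogger("some_log_name_formatted")
log.setLevel(logging.INFO)

log_stringio = io.StringIO()
handler = logging.StreamHandler(log_stringio)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s"))
log.addHandler(handler)

log.info("Hello S3")
print(log_stringio.getvalue())
```

Whatever getvalue() returns at exit, formatted or not, can be passed to put_object as the Body.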