
How to download the latest file of an S3 bucket using Boto3?

The other questions I could find referred to an older version of Boto. I would like to download the latest file in an S3 bucket. The documentation mentions a method list_object_versions() that returns a boolean IsLatest. Unfortunately, I have only managed to set up a connection and download a file. Could you please show me how to extend my code to get the latest file in the bucket? Thank you

import boto3
from botocore.client import Config

conn = boto3.client('s3',
                    region_name="eu-west-1",
                    endpoint_url="customendpoint",
                    config=Config(signature_version="s3", s3={'addressing_style': 'path'}))

From here I don't know how to get the most recently added file from a bucket called mytestbucket. There are various CSV files in the bucket, but of course they all have different names.
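For what it's worth, a minimal sketch of the list_object_versions() route mentioned above could look like the following. It reuses the conn client from the snippet, assumes the bucket name mytestbucket, and only inspects the first page of results (the API returns at most 1,000 entries per call).

# IsLatest marks the current version of each key, so the newest object in the
# bucket is the current version with the greatest LastModified timestamp.
versions = conn.list_object_versions(Bucket='mytestbucket').get('Versions', [])
current = [v for v in versions if v['IsLatest']]
if current:
    newest = max(current, key=lambda v: v['LastModified'])
    print(newest['Key'], newest['LastModified'])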

Update:

import boto3
from botocore.client import Config

s3 = boto3.resource('s3',
                    region_name="eu-west-1",
                    endpoint_url="custom endpoint",
                    aws_access_key_id='1234',
                    aws_secret_access_key='1234',
                    config=Config(signature_version="s3", s3={'addressing_style': 'path'}))
my_bucket = s3.Bucket('mytestbucket22')
unsorted = []
for file in my_bucket.objects.filter():
    unsorted.append(file)

files = [obj.key for obj in sorted(unsorted, key=get_last_modified, reverse=True)][0:9]

This gives me the following error:

NameError: name 'get_last_modified' is not defined
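The NameError simply means the sort key was never defined. With the resource API used in the update, each ObjectSummary exposes a last_modified datetime, so a sketch of the missing pieces (the local filename below is just an example) could be:

get_last_modified = lambda obj: obj.last_modified   # ObjectSummary exposes a datetime

files = [obj.key for obj in sorted(unsorted, key=get_last_modified, reverse=True)]
latest_key = files[0]                               # newest object in the bucket
my_bucket.download_file(latest_key, 'latest.csv')   # local filename is illustrative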
asked Jul 28 '17 by jz22


4 Answers

Variation of the answer I provided for: Boto3 S3, sort bucket by last modified. You can modify the code to suit your needs.

import boto3

get_last_modified = lambda obj: int(obj['LastModified'].strftime('%s'))

s3 = boto3.client('s3')
objs = s3.list_objects_v2(Bucket='my_bucket')['Contents']
# sorted() is ascending, so the most recently modified key is the last element
last_added = [obj['Key'] for obj in sorted(objs, key=get_last_modified)][-1]

If you want to reverse the sort:

[obj['Key'] for obj in sorted(objs, key=get_last_modified, reverse=True)][0]
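To then download the object once its key is known, the same client can be used; the local path below is just an example:

s3.download_file('my_bucket', last_added, '/tmp/latest_file')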
answered Sep 20 '22 by helloV


You can do

import boto3

s3_client = boto3.client('s3')
response = s3_client.list_objects_v2(Bucket='bucket_name', Prefix='prefix')
objects = response['Contents']  # raises a KeyError if nothing matches the prefix
latest = max(objects, key=lambda x: x['LastModified'])
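Since the question is specifically about CSV files, a hedged variant that ignores other keys and tolerates an empty listing might look like this (it reuses response from above):

csv_objects = [obj for obj in response.get('Contents', []) if obj['Key'].endswith('.csv')]
if csv_objects:
    latest_csv = max(csv_objects, key=lambda x: x['LastModified'])
    print(latest_csv['Key'])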
answered Sep 18 '22 by smaraf


This handles the case where there are more than 1,000 objects in the S3 bucket. It is essentially @SaadK's answer, using the newer list_objects_v2 and max() instead of sorting each page.

EDIT: Fixes the issue @Timothée-Jeannin identified, ensuring that the latest object across all pages is found.

https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Paginator.ListObjectsV2

import boto3

def get_most_recent_s3_object(bucket_name, prefix):
    s3 = boto3.client('s3')
    paginator = s3.get_paginator("list_objects_v2")
    page_iterator = paginator.paginate(Bucket=bucket_name, Prefix=prefix)
    latest = None
    for page in page_iterator:
        if "Contents" in page:
            # newest object on this page
            latest2 = max(page['Contents'], key=lambda x: x['LastModified'])
            # keep the newest object seen across all pages
            if latest is None or latest2['LastModified'] > latest['LastModified']:
                latest = latest2
    return latest

latest = get_most_recent_s3_object(bucket_name, prefix)

latest['Key']  # -->   'prefix/objectname'
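Because the function returns None when nothing matches the prefix, a small guard before downloading could look like this (the local filename is a placeholder):

if latest is not None:
    boto3.client('s3').download_file(bucket_name, latest['Key'], 'latest_download')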
answered Sep 18 '22 by marginal_dev


If you have a lot of files, then you'll need to use pagination, as mentioned by helloV. This is how I did it.

get_last_modified = lambda obj: int(obj['LastModified'].strftime('%s'))
s3 = boto3.client('s3')
paginator = s3.get_paginator("list_objects")
page_iterator = paginator.paginate(Bucket="BucketName", Prefix="Prefix")
for page in page_iterator:
    if "Contents" in page:
        # note: last_added is recomputed on every page, so after the loop it holds
        # the newest key of the final page only (see the previous answer for a fix)
        last_added = [obj['Key'] for obj in sorted(page["Contents"], key=get_last_modified)][-1]
answered Sep 21 '22 by SaadK