
Boto3 S3: Get files without getting folders

Using boto3, how can I retrieve all files in my S3 bucket without retrieving the folders?

Consider the following file structure:

file_1.txt
folder_1/
    file_2.txt
    file_3.txt
    folder_2/
        folder_3/
            file_4.txt

In this example I'm only interested in the 4 files.

EDIT:

A manual solution is:

def count_files_in_folder(prefix):
    total = 0
    response = s3_client.list_objects(Bucket=bucket_name, Prefix=prefix)
    # 'Contents' is absent from the response when no keys match the prefix
    for obj in response.get('Contents', []):
        # Skip zero-byte "folder" placeholder keys, which end with a slash
        if not obj['Key'].endswith('/'):
            total += 1
    return total

In this case total would be 4.
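The manual count above can also be made robust for buckets with more than 1000 matching keys, since a single `list_objects` call returns at most 1000. A sketch using a `list_objects_v2` paginator, assuming the client is created elsewhere and passed in:

```python
def is_file_key(key):
    # Zero-byte "folder" placeholder objects end with a slash
    return not key.endswith('/')

def count_files(s3_client, bucket_name, prefix):
    # Paginate so buckets with more than 1000 keys are counted fully
    paginator = s3_client.get_paginator('list_objects_v2')
    total = 0
    for page in paginator.paginate(Bucket=bucket_name, Prefix=prefix):
        for obj in page.get('Contents', []):
            if is_file_key(obj['Key']):
                total += 1
    return total
```

For the structure in the question this would again return 4, since the three folder placeholders are skipped.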

If I just did

count = len(s3_client.list_objects(Bucket=bucket_name, Prefix=prefix)['Contents'])

the result would be 7 objects (4 files and 3 folders):

file_1.txt
folder_1/
folder_1/file_2.txt
folder_1/file_3.txt
folder_1/folder_2/
folder_1/folder_2/folder_3/
folder_1/folder_2/folder_3/file_4.txt

I just want:

file_1.txt
folder_1/file_2.txt
folder_1/file_3.txt
folder_1/folder_2/folder_3/file_4.txt
asked Mar 08 '17 by Vingtoft



2 Answers

There are no folders in S3. What you have is four files named:

file_1.txt
folder_1/file_2.txt
folder_1/file_3.txt
folder_1/folder_2/folder_3/file_4.txt

Those are the actual names of the objects in S3. If what you want is to end up with:

file_1.txt
file_2.txt
file_3.txt
file_4.txt

all sitting in the same directory on a local file system, you would need to manipulate the object name to strip out just the file name. Something like this would work:

import os.path

full_name = 'folder_1/folder_2/folder_3/file_4.txt'
file_name = os.path.basename(full_name)

The variable file_name would then contain 'file_4.txt'.
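As a quick check of that flattening, applied to the key names from the question's example:

```python
import os.path

keys = [
    'file_1.txt',
    'folder_1/file_2.txt',
    'folder_1/file_3.txt',
    'folder_1/folder_2/folder_3/file_4.txt',
]

# Keep only the file-name component of each key
local_names = [os.path.basename(key) for key in keys]
# local_names is now ['file_1.txt', 'file_2.txt', 'file_3.txt', 'file_4.txt']
```

Note that this throws away the prefix entirely, so keys that share a file name (e.g. two different `data.csv` objects) would collide locally.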

answered Sep 20 '22 by garnaat

S3 is an object store. It does not store objects under a directory tree. Newcomers are often confused by the "folder" option in the console, which is in fact just an arbitrary prefix on the object key.

An object prefix is a way to retrieve objects organised by a predefined, fixed key-name prefix structure.

You can imagine a file system that doesn't allow you to create directories, but does allow file names containing a slash "/" or backslash "\" as a delimiter; you can then denote the "level" of a file by a common prefix.

Thus in S3, you can use any of the following to "simulate" a directory that is not a directory:

folder1-folder2-folder3-myobject
folder1/folder2/folder3/myobject
folder1\folder2\folder3\myobject

As you can see, any object name can be stored in S3 regardless of which arbitrary folder separator (delimiter) you use.
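For what it's worth, S3's list APIs expose this grouping through the `Delimiter` parameter: keys sharing a prefix up to the delimiter are rolled up into `CommonPrefixes`, which is how consoles render "folders". A pure-Python sketch of that rollup behaviour (not a boto3 call), using key names from the question:

```python
def roll_up(keys, delimiter='/'):
    # Mimic S3's Delimiter behaviour: keys containing the delimiter are
    # grouped under their first-level common prefix; the rest are listed as-is
    contents, common_prefixes = [], set()
    for key in keys:
        if delimiter in key:
            common_prefixes.add(key.split(delimiter, 1)[0] + delimiter)
        else:
            contents.append(key)
    return contents, sorted(common_prefixes)

keys = ['file_1.txt', 'folder_1/file_2.txt', 'folder_1/folder_2/folder_3/file_4.txt']
contents, prefixes = roll_up(keys)
# contents is ['file_1.txt'], prefixes is ['folder_1/']
```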

However, to help users make bulk file transfers to S3, tools such as the AWS CLI and the s3transfer API simplify this step and create object names that follow your local folder structure.

So if you are sure that all your S3 objects use / or \ as the separator, you can use tools like s3transfer or the AWS CLI to download them by key name.

Here is quick-and-dirty code using the resource iterator. bucket.objects.filter() returns an iterator that doesn't have the same 1000-key limit as a single list_objects()/list_objects_v2() call.

import os
import boto3

s3 = boto3.resource('s3')
mybucket = s3.Bucket("mybucket")
# An empty prefix would return every object in the bucket
bucket_prefix = "some/prefix/here"
objs = mybucket.objects.filter(Prefix=bucket_prefix)

for obj in objs:
    # Skip zero-byte "folder" placeholder objects
    if obj.key.endswith('/'):
        continue
    path, filename = os.path.split(obj.key)
    # download_file throws an exception if the local folder doesn't exist
    if path:
        os.makedirs(path, exist_ok=True)
    mybucket.download_file(obj.key, obj.key)
answered Sep 22 '22 by mootmoot