I have to move a folder from my local machine to S3 and would like to know if there is a way to do so.
The folder contains nested sub-folders with files inside (generally .pdf, .doc, or .docx).
I know I can move a single file from local to s3 by using s3fs
(https://s3fs.readthedocs.io/en/latest/api.html):
S3FileSystem.put(filename, path, **kwargs) Stream data from local filename to file at path
My code looks like this:
def upload_data(filepath, file_name):
    s3 = s3fs.S3FileSystem()
    s3_path = f"name-of-my-bucket/{file_name}"
    s3.put(filepath, s3_path)
HOWEVER, this only lets me upload a single file, and I want to send a whole folder.
I could do it recursively, adding each file one by one (a sketch of that approach is shown below), but:
1) I think it would be easier if I could just send the folder.
2) It would be harder to maintain the structure of the folder, meaning that my local file folders/subfolders/myfile.pdf
would be saved in S3 as myfile.pdf
rather than folders/subfolders/myfile.pdf.
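For reference, the file-by-file version I was hoping to avoid would look roughly like this (the bucket prefix and function name are just placeholders):

import os
import s3fs

def upload_folder_file_by_file(local_dir, bucket_prefix):
    # Walk local_dir and upload each file, keeping its relative path
    # as part of the S3 key so subfolders/myfile.pdf is preserved.
    s3 = s3fs.S3FileSystem()
    for root, _dirs, files in os.walk(local_dir):
        for name in files:
            local_path = os.path.join(root, name)
            rel_path = os.path.relpath(local_path, local_dir).replace(os.sep, "/")
            s3.put(local_path, f"{bucket_prefix}/{rel_path}")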
I know this is fairly old, and possibly this functionality didn't exist at the time of asking, but with s3fs
you can simply set recursive=True:
def upload_data(filepath, file_name):
    s3 = s3fs.S3FileSystem()
    s3_path = f"name-of-my-bucket/{file_name}"
    s3.put(filepath, s3_path, recursive=True)
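For example (the paths here are just illustrative), calling

upload_data("folders", "folders")

should recreate the nested layout under name-of-my-bucket/folders, so folders/subfolders/myfile.pdf keeps its full key instead of being flattened. Depending on your s3fs/fsspec version, you may need a trailing slash on the source or destination path to get the exact nesting you expect.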