I am a beginner in using Boto3 and I would like to transfer a file from an S3 bucket to an SFTP server directly.
My final goal is to write a Python script for AWS Glue.
I have found an article that shows how to transfer a file from an SFTP server to an S3 bucket:
https://medium.com/better-programming/transfer-file-from-ftp-server-to-a-s3-bucket-using-python-7f9e51f44e35
Unfortunately, I can't find anything that does the opposite. Do you have any suggestions/ideas?
My first (wrong) attempt is below, but I would like to avoid downloading the whole file into local memory before moving it to the SFTP server.
import io
import pysftp
import boto3

# get clients
s3_gl = boto3.client('s3', aws_access_key_id='', aws_secret_access_key='')

# parameters
bucket_gl = ''
gl_data = ''
gl_script = ''

# this reads the whole object into memory -- exactly what I want to avoid
source_response = s3_gl.get_object(Bucket=bucket_gl, Key=gl_script + 'file.csv')
body = source_response['Body'].read()
print(body.decode('utf-8'))

#---------------------------------
srv = pysftp.Connection(host="", username="", password="")
with srv.cd('relevant folder in sftp'):
    # put() expects a local file path, so upload the in-memory bytes with putfo()
    srv.putfo(io.BytesIO(body), 'file.csv')

# Closes the connection
srv.close()
To create a managed SFTP server for S3, go to AWS Transfer for SFTP in your AWS Console and create a new server (you can keep the server options at their defaults for a start). On the SFTP server page, add a new SFTP user (or users). Each user's permissions are governed by an associated role in the IAM service.
AWS Transfer Family is a fully managed AWS service for transferring files to and from Amazon S3 buckets over SFTP, FTPS, and FTP; SFTP runs over SSH, so transfers are encrypted. Once the managed server is running, any standard SFTP client (FileZilla, for example) can read from and write to the bucket directly.
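For reference, the managed server and user can also be created programmatically instead of through the console. Below is a minimal sketch using boto3's Transfer Family client; the user name, role ARN, and home directory are hypothetical placeholders.

import boto3

transfer = boto3.client('transfer')

# create a service-managed SFTP endpoint backed by S3
server = transfer.create_server(
    IdentityProviderType='SERVICE_MANAGED',
    Protocols=['SFTP'],
)

# add a user whose S3 permissions come from an IAM role (placeholder ARN)
transfer.create_user(
    ServerId=server['ServerId'],
    UserName='gl_user',
    Role='arn:aws:iam::123456789012:role/my-transfer-role',
    HomeDirectory='/mybucket/home/gl_user',
)

HomeDirectory maps the user's SFTP root onto a bucket path, and the IAM role controls what the user may read or write there.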
"transfer ... directly" can mean number of different things.
Let's assume that you want to transfer the file via the local machine (where the Python code runs), without actually storing a temporary copy of the file to the local file system.
For SFTP upload, you can use the Paramiko library.
Assuming you already have your Paramiko SFTPClient (sftp) and Boto3 client (s3) instances ready (which is covered in the article you linked in your question), you can simply "glue" them together using file-like objects:
with sftp.open('/sftp/path/filename', 'wb', 32768) as f:
    s3.download_fileobj('mybucket', 'mykey', f)
For the purpose of the 32768 argument, see Writing to a file on SFTP server opened using pysftp "open" method is slow.
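To put it all together, here is a minimal end-to-end sketch; the host, credentials, bucket, key, and remote path are placeholders to substitute with your own values.

import boto3
import paramiko

s3 = boto3.client('s3')

# open an SSH transport and start an SFTP session on top of it
# (host key verification omitted for brevity)
transport = paramiko.Transport(('sftp.example.com', 22))
transport.connect(username='user', password='password')
sftp = paramiko.SFTPClient.from_transport(transport)

try:
    # download_fileobj streams the S3 object into the remote file handle
    # chunk by chunk, so the whole file is never held in local memory
    with sftp.open('/sftp/path/filename', 'wb', 32768) as f:
        s3.download_fileobj('mybucket', 'mykey', f)
finally:
    sftp.close()
    transport.close()

Since nothing is written to the local file system, this also works in environments like AWS Glue where you do not want to manage temporary files.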