
How to copy a file to a remote server in Python using SCP or SSH?


To do this in Python (i.e. not wrapping scp through subprocess.Popen or similar) with the Paramiko library, you would do something like this:

import os
import paramiko

ssh = paramiko.SSHClient()
# Trust the hosts already listed in your known_hosts file
ssh.load_host_keys(os.path.expanduser(os.path.join("~", ".ssh", "known_hosts")))
ssh.connect(server, username=username, password=password)

# Open an SFTP session over the SSH connection and upload the file
sftp = ssh.open_sftp()
sftp.put(localpath, remotepath)
sftp.close()
ssh.close()

(You would probably want to deal with unknown hosts, errors, creating any directories necessary, and so on).
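For instance, a minimal sketch along those lines, reusing the same server, username, password, localpath and remotepath placeholders (auto-accepting unknown host keys is convenient, but it weakens protection against man-in-the-middle attacks):

import os
import paramiko

ssh = paramiko.SSHClient()
ssh.load_host_keys(os.path.expanduser("~/.ssh/known_hosts"))
# Accept keys for hosts not yet in known_hosts instead of raising an error
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(server, username=username, password=password)

sftp = ssh.open_sftp()
try:
    # Create the last component of the remote directory if it is missing
    remotedir = os.path.dirname(remotepath)
    if remotedir:
        try:
            sftp.stat(remotedir)
        except IOError:
            sftp.mkdir(remotedir)
    sftp.put(localpath, remotepath)
finally:
    sftp.close()
    ssh.close()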


You can call the scp bash command (it copies files over SSH) with subprocess.run:

import subprocess
subprocess.run(["scp", FILE, "USER@SERVER:PATH"])
# e.g. subprocess.run(["scp", "foo.bar", "user@example.com:/path/to/foo.bar"])

If you're creating the file that you want to send in the same Python program, you'll want to call the subprocess.run command outside the with block you're using to open the file (or call .close() on the file first if you're not using a with block), so you know it has been flushed to disk by Python.

You need to generate (on the source machine) and install (on the destination machine) an ssh key beforehand so that the scp automatically gets authenticated with your public ssh key (in other words, so your script doesn't ask for a password).
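Putting those two points together, a minimal sketch (the file name and the user@example.com destination are placeholders, and an SSH key is assumed to be installed on the remote machine):

import subprocess

# Write the file first; leaving the with block flushes and closes it
with open("results.txt", "w") as f:
    f.write("some output\n")

# check=True raises CalledProcessError if scp exits with a non-zero status
subprocess.run(
    ["scp", "results.txt", "user@example.com:/remote/path/results.txt"],
    check=True,
)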


You'd probably use the subprocess module. Something like this:

import os
import subprocess

p = subprocess.Popen(["scp", myfile, destination])
# Wait for the scp child process to exit
sts = os.waitpid(p.pid, 0)

Where destination is probably of the form user@remotehost:remotepath. Thanks to @Charles Duffy for pointing out the weakness in my original answer, which passed the whole scp operation as a single string argument with shell=True - that wouldn't handle whitespace in paths.

The module documentation has examples of error checking that you may want to perform in conjunction with this operation.
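For example, a minimal sketch of that kind of error checking, reusing myfile and destination from above:

import subprocess

p = subprocess.Popen(["scp", myfile, destination])
# wait() blocks until scp finishes and returns its exit status (0 means success)
if p.wait() != 0:
    raise RuntimeError("scp of %s to %s failed" % (myfile, destination))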

Ensure that you've set up proper credentials so that you can perform an unattended, passwordless scp between the machines. There is already a Stack Overflow question covering that setup.


There are a couple of different ways to approach the problem:

  1. Wrap command-line programs
  2. Use a Python library that provides SSH capabilities (e.g. Paramiko or Twisted Conch)

Each approach has its own quirks. You will need to set up SSH keys to enable password-less logins if you are wrapping system commands like "ssh", "scp" or "rsync". You can embed a password in a script using Paramiko or some other library, but you might find the lack of documentation frustrating, especially if you are not familiar with the basics of the SSH connection (e.g. key exchanges, agents, etc.). It probably goes without saying that SSH keys are almost always a better idea than passwords for this sort of thing.

NOTE: it's hard to beat rsync if you plan on transferring files via SSH, especially if the alternative is plain old scp.

I've used Paramiko with an eye towards replacing system calls but found myself drawn back to the wrapped commands due to their ease of use and immediate familiarity. You might be different. I gave Conch the once-over some time ago but it didn't appeal to me.

If opting for the system-call path, Python offers an array of options such as os.system or the commands/subprocess modules. I'd go with the subprocess module if using version 2.4+.
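As a concrete illustration of the wrapping approach (and of the rsync note above), a minimal sketch using subprocess - the local path and the user@example.com destination are placeholders, and passwordless SSH keys are assumed to be set up already:

import subprocess

# -a: archive mode (recursive, preserves permissions and timestamps); -z: compress in transit
# rsync only transfers what has changed, which is why it tends to beat plain scp
subprocess.run(
    ["rsync", "-az", "/local/dir/", "user@example.com:/remote/dir/"],
    check=True,
)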


I ran into the same problem, but instead of "hacking" around or emulating the command line, I found this answer, which uses the scp module on top of Paramiko:

from paramiko import SSHClient
from scp import SCPClient

ssh = SSHClient()
ssh.load_system_host_keys()        # trust hosts already in the system known_hosts
ssh.connect('example.com')

# SCPClient runs SCP over the existing Paramiko transport
with SCPClient(ssh.get_transport()) as scp:
    scp.put('test.txt', 'test2.txt')   # upload local test.txt as remote test2.txt
    scp.get('test2.txt')               # download remote test2.txt
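Note that SCPClient comes from the third-party scp package (installable with pip install scp) and runs over the Paramiko connection shown above.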