File Transfer Protocol options?

I am looking for a good way to transfer non-trivial amounts of data (10 MB < x < 10 GB) from one machine to another, potentially over multiple sessions.

I have looked briefly at

  • *ftp (sftp, tftp, ftp)
  • http
  • torrents (out because I will not have a seed network in general)
  • rsync (not sure if I can really adapt this to what I need)

Are there any other protocols out there that might fit the bill a little better? Most of the above are not very fault tolerant in and of themselves, but rather rely on client/server apps to pick up the slack. At this stage I care much more about the protocol itself, rather than a particular client/server implementation that works well.

(And yes, I know I could write my own over UDP, but I'd prefer almost anything else!)

asked Dec 14 '22 by JT.
2 Answers

I use rsync (over SSH) to transfer anything that I think might take more than a minute.

It's easy to rate-limit, suspend/resume and get progress reports. You can automate it with SSH keys. It's (usually) already installed (on *nix boxes, anyway).
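As a sketch of what that looks like in practice (the host, paths, and bandwidth cap below are made up):

    # Rate-limited, resumable copy over SSH.
    # --partial keeps partially transferred files so an interrupted
    # run can resume instead of starting over; --bwlimit caps the
    # transfer rate (in units of 1024 bytes per second); --progress
    # reports per-file progress.
    rsync -avz --partial --progress --bwlimit=5000 \
        /data/bigfiles/ user@remote.example.com:/backup/bigfiles/

Re-running the same command after an interruption only sends what is missing or changed.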

Depending on what you need, rsync can probably be adapted to it. If you're distributing to a lot of users, FTP/HTTP might be better for firewall concerns; but rsync is great for one-to-one or one-to-a-few transfers.
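If you do go the HTTP route for wider distribution, resumability can come from the client side. For instance (an illustrative sketch, not a production setup):

    # Serve a directory over plain HTTP (Python 3.7+).
    python3 -m http.server 8080 --directory /data/bigfiles

    # On the receiving end, -c tells wget to continue a partial
    # download, so interrupted sessions pick up where they left off.
    wget -c http://server.example.com:8080/bigfile.iso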

answered Jan 13 '23 by Peter Stone


rsync is almost always the best bet.

Since it transfers only differences, an interrupted transfer costs little: the next run reuses whatever already reached the destination instead of starting over from an empty file.
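A minimal illustration, with a made-up host and path (the key is --partial, which keeps the incomplete file around between runs):

    # First attempt, interrupted partway through:
    rsync -av --partial bigfile.iso user@remote.example.com:/srv/iso/

    # Second attempt: the kept partial file is reused by rsync's
    # delta algorithm, so only the remaining data crosses the wire.
    rsync -av --partial bigfile.iso user@remote.example.com:/srv/iso/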

answered Jan 13 '23 by Javier