I'm trying to transfer a multi-GB file from one server to another; the problem is that the RTT is 150 ms+. I've already tried aria2, but it's limited to 16 connections, and lftp has no protection against stalled transfers.
I'm wondering if it's possible to download one file with multiple connections using the curl CLI.
It's possible. First fetch the total file size with curl's -I option, which issues a HEAD request and prints the response headers, including Content-Length.
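For example, a HEAD request like this prints the size in bytes (the URL is just a placeholder):

```bash
# -s silences progress output; -I sends a HEAD request.
# tr strips the trailing carriage returns from the header lines.
curl -sI "https://example.com/big.iso" | tr -d '\r' \
  | awk 'tolower($1) == "content-length:" {print $2}'
```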
Then you can fork many processes in the shell, giving each curl invocation a different byte range via -r/--range (which sets the Range request header), so that each process downloads a different part of the file.
After all the tasks finish, merge the downloaded slices into the final file.
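Here is a minimal sketch of that whole flow, assuming the server honors Range requests (the URL, output name, and part count are placeholders):

```bash
#!/usr/bin/env bash
set -euo pipefail

url="https://example.com/big.iso"   # placeholder URL
out="big.iso"                       # placeholder output name
parts=8                             # number of parallel connections

# Total size in bytes, from the Content-Length response header.
size=$(curl -sI "$url" | tr -d '\r' \
  | awk 'tolower($1) == "content-length:" {print $2}')

chunk=$(( (size + parts - 1) / parts ))   # bytes per slice, rounded up

# Launch one curl per byte range; -r sets the Range request header.
for ((i = 0; i < parts; i++)); do
  start=$(( i * chunk ))
  end=$(( start + chunk - 1 ))
  (( end >= size )) && end=$(( size - 1 ))
  curl -s -r "$start-$end" -o "$out.part$i" "$url" &
done
wait   # block until every background download finishes

# Concatenate the slices in numeric order, then clean up.
: > "$out"
for ((i = 0; i < parts; i++)); do
  cat "$out.part$i" >> "$out"
  rm "$out.part$i"
done
```

The slices are concatenated in an explicit numeric loop rather than with a shell glob, since `part10` would sort before `part2`.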
I have written a simple script that does this, available here: mcurl.sh. With the -s option you can specify how many tasks to create to download the big file.
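Assuming the script takes the URL as its final argument (hypothetical; check the script's own usage text), an invocation might look like:

```bash
# Hypothetical invocation: split the download into 8 parallel tasks.
./mcurl.sh -s 8 https://example.com/big.iso
```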
No, the curl tool has no such ability built-in. To do it with curl, you need to invoke it multiple times with range downloads.
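For instance, two range invocations like these (placeholder URL and byte offsets) each fetch half of a 2,000,000-byte file, and the pieces are then concatenated in order:

```bash
# -r/--range asks the server for only the given byte span.
curl -r 0-999999        -o part1 https://example.com/big.iso
curl -r 1000000-1999999 -o part2 https://example.com/big.iso
cat part1 part2 > big.iso
```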
(Oh, and by the way: a large RTT is very rarely the explanation for why a plain TCP transfer is slow.)