What is the fastest way to send a large binary file from one PC to another over the Internet?

I need to send large binary data (2 GB–10 GB) from one PC (the client) to another PC (the server) over the Internet. First I tried a WCF service hosted in IIS, using the wsHttpBinding binding with message security, but it took a very long time (a few days), which is unacceptable for me. Now I am thinking about writing client and server applications using sockets. Would that be faster?

What is the best way to do it?

Thanks

asked Feb 14 '11 by Sergey Smelov

4 Answers

Plain old FTP would, in my view, be suitable in this case. With it you get the chance to resume an interrupted transfer without having to redo the job from the start, and you need to account for the possibility that such a massive transfer gets interrupted for some reason.
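As a minimal sketch of what resuming looks like in .NET, assuming an FTP server that supports the REST command (the URL, local path, and credentials below are placeholders):

```csharp
// A minimal sketch of resuming an interrupted FTP download with
// System.Net.FtpWebRequest. The URL, path, and credentials are placeholders.
using System.IO;
using System.Net;

class FtpResumeExample
{
    static void Main()
    {
        const string url = "ftp://server.example.com/data.bin";  // placeholder
        const string localPath = @"C:\temp\data.bin";            // placeholder

        // Resume from wherever the previous attempt stopped.
        long offset = File.Exists(localPath) ? new FileInfo(localPath).Length : 0;

        var request = (FtpWebRequest)WebRequest.Create(url);
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.Credentials = new NetworkCredential("user", "password"); // placeholder
        request.ContentOffset = offset; // issues an FTP REST command

        using (var response = (FtpWebResponse)request.GetResponse())
        using (var remote = response.GetResponseStream())
        using (var local = new FileStream(localPath, FileMode.Append))
        {
            remote.CopyTo(local);
        }
    }
}
```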

– Felice Pollano


When sending large amounts of data, you are limited by the bandwidth of the connection, and you should plan for disruptions: a small interruption can have a big impact if it forces you to resend a lot of data.

You can use BITS (Background Intelligent Transfer Service). It transfers the data in the background and divides it into blocks, so it takes care of a lot of the plumbing for you.

It depends on IIS on the server side and provides a client API for the transfer, so you do not need to write the low-level transfer code yourself.

I don't know whether it will be faster, but it is at least a lot more reliable than making a single HTTP or FTP request, and you can get it running very quickly.

If bandwidth is the problem and the data doesn't have to travel over the Internet, you could also consider high-bandwidth (but high-latency) options like sending a DVD by courier.

You can use BITS from .NET; there is a wrapper on CodeProject.
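As a hedged sketch of what driving BITS through such a wrapper might look like: the namespace and member names below (BitsManager, CreateJob, AddFile, Resume) follow the SharpBITS.NET style but are assumptions, so check the actual wrapper's API before using it.

```csharp
// Hypothetical sketch based on a SharpBITS.NET-style wrapper from CodeProject.
// Type and member names here are assumptions; verify against the real wrapper.
using SharpBits.Base;

class BitsDownloadExample
{
    static void Main()
    {
        var manager = new BitsManager();

        // Create a background download job; BITS persists it across
        // reboots and network interruptions.
        var job = manager.CreateJob("big-file", JobType.Download);
        job.AddFile("http://server.example.com/data.bin", // placeholder URL
                    @"C:\temp\data.bin");                 // placeholder path

        // Resume() both starts a new job and resumes a suspended one.
        job.Resume();
    }
}
```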

– GvS


Well, bandwidth is your problem; going even lower, down to sockets, won't help you much there, since WCF overhead is not significant for long binary responses. One option is a lossless streaming compression algorithm, provided that your data is compressible: do a dry run with zip, and if it shrinks the file on your local disk, you can find a suitable streaming algorithm. By the way, I would suggest providing resume support :)
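As a minimal sketch of streaming compression in .NET with System.IO.Compression.GZipStream (the endpoint and file path are placeholders, and the receiving side would wrap its stream in a matching decompressing GZipStream):

```csharp
// A minimal sketch of compressing a file on the fly while sending it
// over a socket. The host, port, and file path are placeholders.
using System.IO;
using System.IO.Compression;
using System.Net.Sockets;

class CompressedSender
{
    static void Main()
    {
        using (var client = new TcpClient("server.example.com", 9000)) // placeholder endpoint
        using (var network = client.GetStream())
        using (var gzip = new GZipStream(network, CompressionMode.Compress))
        using (var file = File.OpenRead(@"C:\data\big.bin"))            // placeholder path
        {
            // Compresses as it streams; only worthwhile if the dry run
            // with zip showed the data actually shrinks.
            file.CopyTo(gzip);
        }
    }
}
```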

– mmix


Usually it's most appropriate to leverage something that has already been written for this type of thing, e.g. FTP, SCP, rsync, etc.

FTP supports resuming a broken download, although I'm not sure whether it supports resuming an upload. rsync is much better at this kind of thing.

EDIT: It might be worth considering something I'm not terribly familiar with but that might be another option: BitTorrent.

A further option is to roll your own client/server using a protocol library such as UDT, which will give you better-than-TCP performance. See: http://udt.sourceforge.net/
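UDT itself is a C++ library with no official .NET binding, so as a hedged illustration of the roll-your-own approach this sketch uses plain TCP instead: the receiver first reports how many bytes it already has, and the (not shown) server seeks to that offset before sending the rest.

```csharp
// A sketch of the resume handshake for a roll-your-own transfer over
// plain TCP. The endpoint and local path are placeholders; the server
// is assumed to read the 8-byte offset, seek, send, then close.
using System;
using System.IO;
using System.Net.Sockets;

class ResumableReceiver
{
    static void Main()
    {
        const string localPath = @"C:\temp\data.bin"; // placeholder
        long offset = File.Exists(localPath) ? new FileInfo(localPath).Length : 0;

        using (var client = new TcpClient("server.example.com", 9000)) // placeholder endpoint
        using (var network = client.GetStream())
        {
            // Tell the server where to resume from.
            byte[] header = BitConverter.GetBytes(offset);
            network.Write(header, 0, header.Length);

            // Append everything the server sends after that offset;
            // CopyTo runs until the server closes the connection.
            using (var local = new FileStream(localPath, FileMode.Append))
            {
                network.CopyTo(local);
            }
        }
    }
}
```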

– hookenz