
Copying Files over an Intermittent Network Connection

I am looking for a robust way to copy files over a Windows network share that is tolerant of intermittent connectivity. The application is often used on wireless, mobile workstations in large hospitals, and I'm assuming connectivity can be lost either momentarily or for several minutes at a time. The files involved are typically about 200KB - 500KB in size. The application is written in VB6 (ugh), but we frequently end up using Windows DLL calls.

Thanks!

asked Aug 20 '08 by Eric Pohl

4 Answers

I've used Robocopy for this with excellent results. By default, it will retry every 30 seconds until the file gets across.
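A minimal invocation might look like the sketch below. The paths are placeholders; /Z enables restartable (resumable) copies, and /R and /W control the retry count and the wait between retries:

```shell
:: Copy the local outbox to the network share.
:: /Z  - restartable mode (resume a partial file after a drop)
:: /R  - retry each failed file up to 1000 times
:: /W  - wait 30 seconds between retries
robocopy C:\Outbox \\server\share /Z /R:1000 /W:30
```

Note that Robocopy's exit codes are a bitmask (0-7 indicate degrees of success; 8 and above indicate failures), so check them accordingly if you shell out from VB6.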

answered Oct 08 '22 by Joel Spolsky

I'm unclear as to what your actual problem is, so I'll throw out a few thoughts.

  • Do you want restartable copies (with such small file sizes, that doesn't seem like it'd be that big of a deal)? If so, look at CopyFileEx with COPY_FILE_RESTARTABLE
  • Do you want verifiable copies? Sounds like you already have that by verifying hashes.
  • Do you want better performance? It's going to be tough, as it sounds like you can't run anything on the server. Otherwise, TransmitFile may help.
  • Do you just want a fire-and-forget operation? I suppose shelling out to Robocopy, TeraCopy, or the like would work - but it seems a bit hacky to me.
  • Do you want to know when the network comes back? IsNetworkAlive has your answer.

Based on what I know so far, I think the following pseudo-code would be my approach:

sourceFile = Compress("*.*");
destFile = "X:\files.zip";

int copyFlags = COPY_FILE_FAIL_IF_EXISTS | COPY_FILE_RESTARTABLE;
// CopyFileEx returns zero on failure, so keep retrying until it succeeds
while (CopyFileEx(sourceFile, destFile, null, null, false, copyFlags) == 0) {
   do {
     // optionally, increment a failed counter to break out at some point
     Sleep(1000);
   } while (!IsNetworkAlive(NETWORK_ALIVE_LAN));
}

Compressing the files first spares you from tracking which files copied successfully and which need to be restarted. It should also make the copy go faster (smaller total size, and one large file instead of many small ones), at the expense of some CPU power on both sides. A simple batch file can decompress it on the server side.

answered Oct 08 '22 by Mark Brackett


Try using BITS (Background Intelligent Transfer Service). It's the infrastructure that Windows Update uses, is accessible via the Win32 API, and is built specifically to address this.

It's usually used for application updates, but it should work well in any file-moving situation.

http://www.codeproject.com/KB/IP/bitsman.aspx
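For a quick experiment without writing any API code, the bitsadmin command-line tool (deprecated in favor of PowerShell's BITS cmdlets, but present on older systems) can drive the same service. The job name and paths below are placeholders, and this sketch pulls a file down from the share (BITS takes the remote name first, then the local name):

```shell
:: Create a BITS download job, queue one file, and start it.
:: BITS persists the job and resumes it automatically after
:: network drops and even reboots until the transfer completes.
bitsadmin /create pullJob
bitsadmin /addfile pullJob \\server\share\report.dat C:\Inbox\report.dat
bitsadmin /resume pullJob
```

Uploads are also supported, but they require an HTTP endpoint with the BITS server extension rather than a plain SMB share.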

answered Oct 09 '22 by TheSmurf


I agree with Robocopy as a solution... that's why the utility is called "Robust File Copy".

"I've used Robocopy for this with excellent results. By default, it will retry every 30 seconds until the file gets across."

And by default, a million retries. That should be plenty for your intermittent connection.

It also does restartable transfers, and you can even throttle transfers by inserting a gap between packets (the /IPG switch), assuming you don't want to use all the bandwidth because other programs share the same connection.
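Putting those switches together, a throttled, long-retrying copy might look like this (paths are placeholders; /R:1000000 simply spells out Robocopy's default retry count):

```shell
:: Restartable copy with the default million retries, 30-second
:: waits, and a 50 ms inter-packet gap (/IPG) so the transfer
:: doesn't saturate a shared wireless link.
robocopy C:\Outbox \\server\share /Z /R:1000000 /W:30 /IPG:50
```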

answered Oct 09 '22 by Robbo