 

How can I unlock a FileStream lock?

I am implementing a module to upload files in chunks from a client machine to a server. On the server side I am using a WCF SOAP service.

In order to upload files in chunks, I have implemented this sample from Microsoft http://msdn.microsoft.com/en-us/library/aa717050.aspx. I have been able to get it working in a simple scenario, so it does upload the files in chunks. This chunking module is using a WSDualHttpBinding.

I need to implement a feature to re-upload a file in case the upload process is stopped for any reason (user choice, machine turned off, etc) while that specific file is being uploaded.

At my WCF service I have a method that handles the file writing at the server side:

public void UploadFile(RemoteFileInfo request)
{
    FileInfo fi = new FileInfo(Path.Combine(Utils.StorePath(), request.FileName));

    if (!fi.Directory.Exists)
    {
        fi.Directory.Create();
    }

    // The using block guarantees the FileStream is disposed (and its file
    // lock released) even if the client aborts and Read throws.
    using (FileStream file = new FileStream(fi.FullName, FileMode.Create, FileAccess.Write))
    {
        int count;
        byte[] buffer = new byte[4096];
        while ((count = request.FileByteStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            file.Write(buffer, 0, count);
            file.Flush();
        }
    }

    if (request.FileByteStream != null)
    {
        request.FileByteStream.Close();
        request.FileByteStream.Dispose();
    }
}

The chunking module keeps sending chunks for as long as request.FileByteStream.Read(buffer, 0, buffer.Length) keeps returning data.

Once the FileStream is initialized, the file is locked (the normal behavior when opening a FileStream for writing). The problem is this: if I stop the upload while the send/receive is in progress, the channel used by the chunking module is not cancelled, so the WCF service keeps waiting for more data and the file stays locked until the send timeout expires. That timeout is set to 1 hour because I need to upload files larger than 2.5 GB. If I then try to upload the same file again, the WCF service throws an exception because the FileStream cannot be opened again for that file.

I would like to know if there is a way to avoid or release the file lock, so that on the next run I can re-upload the same file even though the previous FileStream still holds the lock.

Any help would be appreciated. Thanks.

asked Apr 18 '12 by Fer

1 Answer

I don't personally like this sort of solution. Maintaining the connection is not ideal.

Using your example, you could be halfway through a 2.5 GB file when the process is aborted, and you end up in exactly the situation you describe above. To make matters worse, you then have to resubmit all of the data that was already uploaded.

I would go the route of handling the blocks myself and appending them to the same file server side. Call a WCF method that indicates a file is starting, upload the data in blocks, and then call another method when the upload is complete. If you are confident that the file names are unique, you could even accomplish this with a single method call.

Something like:

ulong StartFile(string filename) // This returns the data already uploaded
void UploadFile(string filename, ulong start, byte[] data)
void EndFile(string filename) // Just as a safety net
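A minimal sketch of what those three operations might look like on the server. The GetStorePath() helper, the offset check, and the empty EndFile body are illustrative assumptions, not part of the original answer:

```csharp
using System;
using System.IO;

// Sketch of the three server-side operations described above.
public class ResumableUploadService
{
    private static string GetStorePath() => @"C:\Uploads"; // assumed store location

    // Returns the number of bytes already on disk so the client can
    // resume from that offset instead of re-sending everything.
    public ulong StartFile(string filename)
    {
        string path = Path.Combine(GetStorePath(), filename);
        return File.Exists(path) ? (ulong)new FileInfo(path).Length : 0UL;
    }

    // Appends one block at the offset the client claims; rejecting a
    // mismatched offset guards against duplicate or out-of-order blocks.
    public void UploadFile(string filename, ulong start, byte[] data)
    {
        string path = Path.Combine(GetStorePath(), filename);
        using (var fs = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write))
        {
            if ((ulong)fs.Length != start)
            {
                throw new InvalidOperationException("Unexpected block offset.");
            }
            fs.Seek(0, SeekOrigin.End);
            fs.Write(data, 0, data.Length);
        }
    }

    // Safety net: called when the client believes the upload is done.
    public void EndFile(string filename)
    {
        // e.g. validate the final length, rename the file out of a
        // staging area, or clear any per-file bookkeeping.
    }
}
```

Since each block is written and the stream closed before the method returns, an aborted client leaves no lingering lock on the file; the next StartFile call simply reports how far the previous attempt got.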

A simpler workaround, if you don't want to go the route outlined above (it doesn't answer your question directly), is to upload to a temporary file name and rename the file once the upload is complete. You should adopt this approach anyway, so that an application on the server cannot pick up the file before the upload has finished.
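As a sketch of that rename approach (the paths and the ".partial" suffix are assumed conventions, not from the original answer):

```csharp
using System.IO;

// Write to a temporary name while the upload is in progress...
string finalPath = Path.Combine(@"C:\Uploads", "bigfile.bin"); // assumed paths
string tempPath = finalPath + ".partial";

using (var fs = new FileStream(tempPath, FileMode.Create, FileAccess.Write))
{
    // ... write the incoming chunks to fs here ...
}

// ...then publish it under the real name only once it is complete,
// so nothing on the server ever sees a half-written file.
if (File.Exists(finalPath))
{
    File.Delete(finalPath);
}
File.Move(tempPath, finalPath);
```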

answered Nov 11 '22 by Graymatter