
MultipartFormDataStreamProvider and reading file immediately after it's uploaded fails sometimes

I have some Web API code that I've assembled from SO posts and other sites. However, the Task stuff is still new to me. I'm trying to copy an uploaded file to a new location, but sometimes (not all the time) I get an exception while trying to copy the file. The exception indicates that the file is in use by another process. It doesn't happen every time, though. I think I need to move the copy operation somewhere else. Here's my code. Any suggestions?

var provider = new MultipartFormDataStreamProvider(uploadroot);
var task = Request.Content.ReadAsMultipartAsync(provider).ContinueWith<HttpResponseMessage>(t =>
{
    if (t.IsFaulted || t.IsCanceled)
        throw new HttpResponseException(HttpStatusCode.InternalServerError);

    var docConversionId = Guid.NewGuid().ToString("N");
    var sourceFilePath = Path.Combine(uploadroot, provider.FileData.First().LocalFileName);
    var destinationFilePath = Path.Combine(inboxroot, docConversionId);

    // This sometimes throws an IOException: "file is in use by another process".
    File.Copy(sourceFilePath, destinationFilePath);

    var response = new HttpResponseMessage(HttpStatusCode.OK);
    response.Content = new StringContent(docConversionId);
    //response.Content.Headers.Add("DocumentConversionId", docConversionId);
    return response;
});
return task;
asked Jul 19 '13 by Doug Dawson

2 Answers

You could be hitting a known issue where reading or deleting the file immediately after ReadAsMultipartAsync completes can fail.

The following bug report tracks it (see the resolution notes for details on why it happens, along with a workaround):

https://aspnetwebstack.codeplex.com/workitem/176

answered Oct 02 '22 by Kiran


Since the original discussion on CodePlex is dead, I'm pasting here the original explanation and workaround from when the issue was closed back in 2013:

We are closing this issue because the root cause was found to be in framework code. A separate bug has been opened against an external partner team, and this issue will be tracked there.

The changeset: http://aspnetwebstack.codeplex.com/SourceControl/changeset/changes/452b8e1dfa40

reverts an attempted fix where we used FileOptions.WriteThrough to ensure it was not a race between the File Cache and FileStream code. But WriteThrough did not address the core bug and caused a performance degradation.

Impact on user code is this: if you upload a file with MultipartFormDataContent and read it on the server using MultipartFormDataStreamProvider, the underlying file on the server may not be fully closed after the ReadAsMultipartAsync completes. There is a small window where native code may still be releasing file resources on another thread.

The impact is that a File.Delete or File.OpenRead done on that file immediately after the async read may fail with an IOException ("file is in use by another process"). We observed about 0.3% failure rate in high-load situations. The only known work-around is to put a try/catch around those operations, delay a short time if an IOException occurs, and retry. A 50ms delay generally works, but allowing for multiple retries is recommended. This allows the native thread to finish releasing its resources. In our stress labs, this catch-delay-and-retry algorithm always succeeds.
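The catch-delay-and-retry workaround described above could be sketched like this (a minimal illustration; the helper name `CopyWithRetry` and the retry parameters are my own, not from the original answer — only the 50ms delay and the multiple-retries recommendation come from it):

```csharp
using System;
using System.IO;
using System.Threading;

static class FileRetry
{
    // Copies a file, retrying on IOException so the native thread
    // has time to finish releasing the file handle.
    public static void CopyWithRetry(string source, string destination,
                                     int maxAttempts = 5, int delayMs = 50)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                File.Copy(source, destination);
                return; // success
            }
            catch (IOException) when (attempt < maxAttempts)
            {
                // File may still be held open; wait briefly and retry.
                Thread.Sleep(delayMs);
            }
        }
    }
}
```

In the question's code, the bare `File.Copy(sourceFilePath, destinationFilePath)` call would be replaced with `FileRetry.CopyWithRetry(sourceFilePath, destinationFilePath)`; the exception filter (`when`) ensures the final attempt propagates the IOException instead of swallowing it.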

answered Oct 02 '22 by jrequejo