I am trying to upload 300 GB of data (streamed) to Azure Blob storage. The code I am using to perform the upload looks like this:
var stream = blob.OpenWrite();
[...]
// the buffer is filled in with 128KB chunks of data from a larger 300GB file
stream.Write(buffer, offset, count);
After around 8 hours of uploading, I receive the following error:
at Microsoft.WindowsAzure.Storage.Core.Util.StorageAsyncResult`1.End() in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Core\Util\StorageAsyncResult.cs:line 77
at Microsoft.WindowsAzure.Storage.Blob.BlobWriteStream.EndWrite(IAsyncResult asyncResult) in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Blob\BlobWriteStream.cs:line 211
ErrorMessage = The client could not finish the operation within specified timeout.
As a side note, my upload speed is around 2MB/s (might be related to the timeout message). Any help would be appreciated.
Based on your description and the error message, I suggest you increase the value of BlobRequestOptions.MaximumExecutionTime if you don't want the upload to time out as quickly.
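For example, here is a minimal sketch assuming the classic Microsoft.WindowsAzure.Storage SDK and the same blob, buffer, offset and count variables as in your code; the timeout values are only placeholders that you would tune to your own upload:
var options = new BlobRequestOptions
{
    // Upper bound for the whole operation, including retries; set it well
    // above the expected duration of the 300 GB upload.
    MaximumExecutionTime = TimeSpan.FromHours(48),

    // Server-side timeout applied to each individual request.
    ServerTimeout = TimeSpan.FromMinutes(5)
};

// Pass the options to OpenWrite so every chunk written to the stream
// uses the longer timeout.
using (var stream = blob.OpenWrite(null, options, null))
{
    stream.Write(buffer, offset, count);
}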
I also suggest you enable Storage Analytics logging and metrics and look at them to see whether the latency is server latency or end-to-end latency. For more details about how to monitor, diagnose, and troubleshoot Microsoft Azure Storage, you can refer to this article.
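If you prefer to turn the analytics on from code rather than from the portal, here is a minimal sketch assuming a CloudBlobClient named blobClient created from your storage account; the ServiceProperties types live in the Microsoft.WindowsAzure.Storage.Shared.Protocol namespace, and the 7-day retention is only an example:
ServiceProperties properties = blobClient.GetServiceProperties();

// Log read, write and delete requests and keep the logs for 7 days.
properties.Logging.LoggingOperations = LoggingOperations.All;
properties.Logging.RetentionDays = 7;
properties.Logging.Version = "1.0";

// Record hourly metrics, including per-API latency, for 7 days.
properties.HourMetrics.MetricsLevel = MetricsLevel.ServiceAndApi;
properties.HourMetrics.RetentionDays = 7;
properties.HourMetrics.Version = "1.0";

blobClient.SetServiceProperties(properties);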
Besides that, I suggest you try the Microsoft Azure Storage Data Movement Library to upload large files to Blob storage.
It is designed for high-performance uploading, downloading, and copying of Azure Storage blobs and files. You can install it from the Visual Studio NuGet package manager.
For more details about how to use it, you can refer to this article.
Here is an example:
using System;
using System.Threading;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
// Plus the Data Movement namespace for TransferManager, SingleTransferContext
// and TransferStatus from the Microsoft.Azure.Storage.DataMovement NuGet package.

// Connect to the storage account and get (or create) the target container.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("connectstring");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference("foobar");
blobContainer.CreateIfNotExists();

string sourcePath = @"yourfilepath";
CloudBlockBlob destBlob = blobContainer.GetBlockBlobReference("foobar");

// Number of blocks uploaded in parallel.
TransferManager.Configurations.ParallelOperations = 64;

// Report progress as the upload runs.
SingleTransferContext context = new SingleTransferContext();
context.ProgressHandler = new Progress<TransferStatus>((progress) =>
{
    Console.WriteLine("Bytes uploaded: {0}", progress.BytesTransferred);
});

// Start the upload and wait for it to complete.
var task = TransferManager.UploadAsync(
    sourcePath, destBlob, null, context, CancellationToken.None);
task.Wait();
It will automatically split the file into blocks and send each block to Azure Storage in parallel.