
Uploading blobs as multi-part upload causes net::ERR_FILE_NOT_FOUND on Chrome after 500 MB

I'm having a really weird issue only on Google Chrome and Chromium.

The background is:

I upload files to my server using the multi-part upload method: I break each file into chunks of 10 MB and send the chunks to the server one by one. This works flawlessly in all browsers with files of any size; the issue started when I needed to encrypt each chunk.
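The slicing step itself is just the standard File.slice pattern, roughly like this (illustrative sketch; the helper name and constant are not the real app code):

var CHUNK_SIZE = 10 * 1024 * 1024; // 10 MB per chunk

function getChunk(file, index) {
    var start = index * CHUNK_SIZE;
    var end = Math.min(start + CHUNK_SIZE, file.size);
    return file.slice(start, end); // a Blob view onto the original File
}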

For encryption I use CryptoJS: before uploading each chunk, I encrypt it and get the resulting Blob to upload. This works fine on Chrome as long as I upload fewer than 50 chunks (50 blobs, around 500 MB in total); after that I get a POST http://(...) net::ERR_FILE_NOT_FOUND.
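The per-chunk encryption step looks roughly like this (sketch only; the names are illustrative, and it assumes CryptoJS's typed-array component is loaded so WordArray.create accepts an ArrayBuffer):

function encryptChunk(chunkBlob, key, callback) {
    var reader = new FileReader();
    reader.onload = function () {
        var words = CryptoJS.lib.WordArray.create(reader.result);
        var cipherText = CryptoJS.AES.encrypt(words, key).toString(); // base64 output
        callback(new Blob([cipherText], { type: 'application/octet-stream' }));
    };
    reader.readAsArrayBuffer(chunkBlob);
}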

Weirdly, this fails only in Chrome and Chromium; it works in all of the other browsers, including Opera, which is basically Chrome nowadays. I tested it on IE, Firefox, Edge, Safari, Opera, Chrome and Chromium.

Below you can see how my code works so you can get an idea of what's happening. This is not the real code I use in the app; it's test code I wrote that yields the same result.

Instead of taking a slice (File.slice) of the File I'm uploading as a chunk and encrypting it to get the blob, here I generate a bogus blob with the size of my chunk. I put in the setTimeout to simulate the time it takes to encrypt a blob. As I said before, this gives the same result as my real code:

function uploadNext(prevResponse) {
    // currentPart and totalPartsFile are tracked in supporting code omitted here
    if (currentPart == totalPartsFile)
        return;

    //var chunk = getNextChunk();
    var totalSize = file.size;

    setTimeout(function() {
        // bogus blob standing in for an encrypted chunk
        var blob = new Blob([new ArrayBuffer(constants.chunkSize)], {
            type: 'application/octet-stream',
            name: file.name
        });

        console.log(blob);

        blob.encrypted = true;
        blob.key = encryptionKey;
        blob.mimeType = file.mimeType;
        blob.name = file.name;
        blob.originalFileSize = originalFileSize || file.size;

        uploadFile(objectId, currentPart, blob, totalSize, prevResponse, function(resp) {
            uploadNext(resp);
        });
    }, 1000); // simulates the time encryption takes
}

So, the code above is where my blob is generated; below is the upload part:

function uploadFile (objectId, index, blob, totalSize, prevResponse, callback) {

    var format = "encrypted";
    var params = "?format=" + format + (format === "encrypted" ? "&encoding=base64" : "");
    var endPoint = constants.contentServiceUrl + resourceService.availableResources.addContents.link.split(':objectId').join(objectId) + params;

    var formData = new FormData();

    formData.append("totalFileSizeBytes", totalSize);
    formData.append("partIndex", index);
    formData.append("partByteOffset", previousOffset);
    formData.append("chunkSize", blob.size);
    formData.append("totalParts", totalPartsFile);
    formData.append("filename", blob.name);

    if (currentPart != 0) {
        formData.append("uploadId", prevResponse.uploadId);
        formData.append("bucket", prevResponse.bucket);
    }

    if (finalChunk) {
        for (var key in etags1) {
            formData.append("etags[" + key + "]", etags1[key]);
        }
    }

    formData.append("data", blob);

    previousOffset += blob.size;

    var request = {
        method: 'POST',
        url: endPoint,
        data: formData,
        headers: {
            'Content-Type': 'multipart/form-data'
        }
    };

    $http(request)
        .success(function(d) {
            _.extend(etags1, d.etags);
            console.log(d);
            callback(d);
        })
        .error(function(d) {
            console.log(d);
        });
}

Of course there are other supporting variables and code that I didn't put here, but this is enough to give an idea of what we're dealing with.

In this example I'm using AngularJS' $http module, but I've tried with pure XMLHttpRequest as well and I got the same result.
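For reference, the bare-XMLHttpRequest version of the same POST is roughly this (sketch reusing the variables from uploadFile above; the browser sets the multipart boundary itself):

var xhr = new XMLHttpRequest();
xhr.open('POST', endPoint);
xhr.onload = function () {
    callback(JSON.parse(xhr.responseText));
};
xhr.send(formData);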

Like I said, I only get the POST http://(...) net::ERR_FILE_NOT_FOUND with files bigger than 499 MB (50+ chunks), and only in Chrome.

I'm posting this here because I've been looking for a solution and couldn't find anything related to this problem. The closest thing I found was this issue in the Chromium project tracker:

https://code.google.com/p/chromium/issues/detail?id=375297

At this point I really don't know what else to try, so I'd like to know if anyone has had a similar problem in the past and managed to fix it.

Thank you in advance for any answers.

Eric.M asked Jan 14 '16

2 Answers

Chrome can only allocate about 500 MB for any blob, so an allocation of 500 MB + 1 byte simply fails. To work around this, read the file in chunks of at most 499 MB and merge the parts back together on the server.
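A minimal sketch of that server-side merge, assuming Node.js and that each uploaded part was saved to its own file (paths and function names are hypothetical):

var fs = require('fs');

function mergeParts(partPaths, outPath) {
    var out = fs.createWriteStream(outPath);
    (function next(i) {
        if (i === partPaths.length) return out.end();
        var part = fs.createReadStream(partPaths[i]); // parts in partIndex order
        part.on('end', function () { next(i + 1); });
        part.pipe(out, { end: false });               // keep the output stream open
    })(0);
}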

Or you can try something like JSZip and then upload the zip; that worked for me.

    var zip = new JSZip();
    zip.file("file1", "content1");
    zip.file("file2", "content2");
    zip.generateAsync({ type: "blob" }).then(function (zipBlob) {
        // upload zipBlob instead of the raw chunks
    });
MehulJoshi answered Oct 05 '22


Looking at the Chromium source files, I found the blob limits:

  1. ChromeOS:

    • RAM - 20%
    • Disk - 50% (note: the disk is the user partition, so the operating system can still function if this is full)
  2. Android:

    • RAM - 1%
    • Disk - 6%
  3. Desktop:

    • RAM - 20%, or 2 GB if x64
    • Disk - 10%

chromium repo link: https://cs.chromium.org/chromium/src/storage/browser/blob/blob_memory_controller.cc?l=63
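There is no API to inspect the blob store itself, but where navigator.storage.estimate() is supported it gives a rough idea of the origin's storage quota (sketch):

if (navigator.storage && navigator.storage.estimate) {
    navigator.storage.estimate().then(function (est) {
        console.log('usage:', est.usage, 'quota:', est.quota);
    });
}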

Alex Nikulin answered Oct 05 '22