What I want to implement is:
In the front end, I use the HTML5 File API to read a file and upload its content to the PHP back end via Ajax. This works fine when the file is small, but a large enough file crashes Chrome. So I split the large file into chunks with file.slice, and once all chunks have been uploaded to PHP, they are merged into a single complete file.
The code is as follows.
The front end:
<style>
  #container {
    min-width: 300px;
    min-height: 200px;
    border: 3px dashed #000;
  }
</style>
<div id='container'>
</div>
<script>
function addDNDListener(obj){
    obj.addEventListener('dragover', function(e){
        e.preventDefault();
        e.stopPropagation();
    }, false);
    obj.addEventListener('dragenter', function(e){
        e.preventDefault();
        e.stopPropagation();
    }, false);
    obj.addEventListener('drop', function(e){
        e.preventDefault();
        e.stopPropagation();
        var ul = document.createElement("ul");
        var filelist = e.dataTransfer.files;
        for(var i = 0; i < filelist.length; i++){
            var file = filelist[i];
            var li = document.createElement('li');
            li.innerHTML = '<label id="'+file.name+'">'+file.name+':</label> <progress value="0" max="100"></progress>';
            ul.appendChild(li);
        }
        document.getElementById('container').appendChild(ul);
        for(var i = 0; i < filelist.length; i++){
            var file = filelist[i];
            uploadFile(file);
        }
    }, false);
}
function uploadFile(file){
    var loaded = 0;
    var step = 1024*1024;
    var total = file.size;
    var start = 0;
    var progress = document.getElementById(file.name).nextSibling;
    var reader = new FileReader();
    reader.onprogress = function(e){
        loaded += e.loaded;
        progress.value = (loaded/total) * 100;
    };
    reader.onload = function(e){
        var xhr = new XMLHttpRequest();
        var upload = xhr.upload;
        upload.addEventListener('load', function(){
            if(loaded <= total){
                blob = file.slice(loaded, loaded+step+1);
                reader.readAsBinaryString(blob);
            }else{
                loaded = total;
            }
        }, false);
        xhr.open("POST", "upload.php?fileName="+file.name+"&nocache="+new Date().getTime());
        xhr.overrideMimeType("application/octet-stream");
        xhr.sendAsBinary(e.target.result);
    };
    var blob = file.slice(start, start+step+1);
    reader.readAsBinaryString(blob);
}
window.onload = function(){
    addDNDListener(document.getElementById('container'));
    if(!XMLHttpRequest.prototype.sendAsBinary){
        XMLHttpRequest.prototype.sendAsBinary = function(datastr){
            function byteValue(x){
                return x.charCodeAt(0) & 0xff;
            }
            var ords = Array.prototype.map.call(datastr, byteValue);
            var ui8a = new Uint8Array(ords);
            try{
                this.send(ui8a);
            }catch(e){
                this.send(ui8a.buffer);
            }
        };
    }
};
</script>
The PHP code:
<?php
$filename = "upload/".$_GET['fileName'];
//$filename = "upload/".$_GET['fileName']."_".$_GET['nocache'];
$xmlstr = $GLOBALS['HTTP_RAW_POST_DATA'];
if(empty($xmlstr)){
    $xmlstr = file_get_contents('php://input');
}
$is_ok = false;
while(!$is_ok){
    $file = fopen($filename, "ab");
    if(flock($file, LOCK_EX)){
        fwrite($file, $xmlstr);
        flock($file, LOCK_UN);
        fclose($file);
        $is_ok = true;
    }else{
        fclose($file);
        sleep(3);
    }
}
The problem is that after all the chunks have been uploaded to the server and appended into a new file, the total size is smaller than the original and the merged file is broken. Where is the problem, and how can I fix it?
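One way to sanity-check the slicing arithmetic: Blob.slice takes an exclusive end index, so consecutive chunks must meet exactly, with the next start equal to the previous end (not end + 1). A small stand-alone helper (hypothetical, not part of the code above) shows the boundaries a fixed step should produce:

```javascript
// Hypothetical helper: compute the [start, end) byte ranges that tile
// a file exactly once -- no overlap, no gap. Blob.slice's end index is
// exclusive, so the next chunk starts at the previous chunk's end.
function chunkRanges(totalSize, step) {
    var ranges = [];
    for (var start = 0; start < totalSize; start += step) {
        ranges.push([start, Math.min(start + step, totalSize)]);
    }
    return ranges;
}

// Example: a 2.5 MB file in 1 MB steps -> three chunks whose lengths
// sum to exactly the file size.
var step = 1024 * 1024;
var ranges = chunkRanges(2.5 * step, step);
```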
You can send part of a file by slicing it and passing the Blob straight to send:

xhr.send(blob.slice(0,10))
The only time it's okay to read/slice the file yourself is when you decide to encrypt/decrypt/zip the data before sending it to the server, and even then only as a stopgap until all browsers support streams. At that point, take a look at fetch and ReadableStream:
fetch(url, {method: 'post', body: new ReadableStream({...})})
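A rough sketch of what that could look like, assuming a runtime with streaming request-body support (Chromium additionally requires the duplex: 'half' option for stream bodies):

```javascript
// Sketch only: stream a file's bytes to the server without ever
// buffering the whole file in memory. Assumes streaming request-body
// support; the URL is whatever endpoint you post to.
function uploadStream(file, url) {
    return fetch(url, {
        method: 'POST',
        body: file.stream(),  // ReadableStream over the file's bytes
        duplex: 'half'        // required by Chromium for stream bodies
    });
}
```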
If you just need to forward the blob to the server, simply do:

xhr.send(blob_or_file)

and the browser will take care of reading it (correctly) without loading it into memory. The file can be as large as you like.
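Putting that together with the chunking from the question, a minimal sketch might look like this (assumptions: the same upload.php endpoint, an append-only server like the question's, and a hypothetical fallback name; chunks are sent sequentially so the server receives them in order):

```javascript
// Sketch only: upload a file in 1 MB chunks, passing each Blob slice
// straight to the request body -- no FileReader, no sendAsBinary.
// Sequential awaits keep the chunks in order for an appending server.
async function uploadInChunks(file, url, step) {
    step = step || 1024 * 1024;
    for (var start = 0; start < file.size; start += step) {
        var chunk = file.slice(start, start + step); // end index is exclusive
        var res = await fetch(url + '?fileName=' + encodeURIComponent(file.name || 'upload.bin'), {
            method: 'POST',
            body: chunk
        });
        if (!res.ok) {
            throw new Error('chunk upload failed at offset ' + start);
        }
    }
}
```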