My larger problem is that I'd like to use HTML5 Drag and Drop to allow image uploads to my S3 bucket via CORS. I'm able to get something into S3, but it always ends up as what appears to be base64 encoded content.
myFileReader.readAsArrayBuffer(f);
//[...]
function on_onload(file_name, file_type, file_data) {
// file_name and file_type are bound to the function via a closure,
// file_data is passed in during the actual callback invocation.
var xhr = new XMLHttpRequest();
var fd = new FormData();
// code that sets AWS credentials for fd omitted
// _arrayBufferToBase64() just does a binary to base64 conversion
// as described in https://stackoverflow.com/questions/9267899/arraybuffer-to-base64-encoded-string
fd.append('file', _arrayBufferToBase64(file_data));
xhr.open('POST', my_aws_bucket_endpoint, true);
xhr.send(fd);
}
`_arrayBufferToBase64()` is just the looped code from this answer.
After attempting to upload `foo.jpg`:
$ wget [my_uri]/foo.jpg
[...]
HTTP request sent, awaiting response... 200 OK
Length: 295872 (289K) [image/jpeg]
Saving to: 'foo.jpg'
$ file foo.jpg
foo.jpg: ASCII text, with very long lines, with no line terminators
$ head -c 20 foo.jpg
/9j/4AAQSkZJRgABAQEA
If I try to use `readAsBinaryString()` as described in this answer, and then assign the returned data to the `'file'` key, no data is sent and I end up with a zero-length file in my S3 bucket.
Answering my own question:
It turns out that you don't need to use a `FileReader` at all. The `File` object that comes from a `DataTransfer` event is already compatible with `FormData`.
An example is available here. All you need to do to upload your files to S3 is modify that example by setting your credentials and signature in your `FormData` prior to kicking off the `XMLHttpRequest` to S3.
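A minimal sketch of that fix, assuming the classic S3 browser-based POST upload (the field names `key`, `AWSAccessKeyId`, `policy`, and `signature` are the standard POST policy fields; the helper name `buildS3FormData` and the credential variables are illustrative, and the signing code is omitted just as in the question):

```javascript
// Append the raw File object from the drop event directly to the FormData:
// no FileReader, no Base64 conversion. The browser sends it as binary
// multipart content.
function buildS3FormData(file, awsFields) {
  const fd = new FormData();
  // S3 requires all policy fields to precede the file; 'file' must be last.
  for (const [name, value] of Object.entries(awsFields)) {
    fd.append(name, value);
  }
  fd.append('file', file); // the File/Blob itself, not a Base64 string
  return fd;
}

// Usage inside a drop handler (browser only):
// dropZone.addEventListener('drop', (e) => {
//   e.preventDefault();
//   const file = e.dataTransfer.files[0];
//   const fd = buildS3FormData(file, {
//     key: file.name,                     // illustrative policy fields
//     AWSAccessKeyId: MY_ACCESS_KEY_ID,
//     policy: MY_BASE64_POLICY,
//     signature: MY_SIGNATURE,
//   });
//   const xhr = new XMLHttpRequest();
//   xhr.open('POST', my_aws_bucket_endpoint, true);
//   xhr.send(fd);
// });
```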