I've got a drag-and-drop handler that takes the file dropped on it and converts it to Base64 data. Previously it uploaded to Imgur, whose API supports Base64 uploads, and now I'm moving it to Amazon S3.
I've seen examples of people using XMLHttpRequest and CORS to upload data to S3 directly, but I'm using Amazon's AWS S3 SDK gem to avoid having to sign policies and so on, since the gem does that for me. So what I've done is send the Base64 data to a local controller method, which uses the gem to upload to S3.
The other posts using Ajax I've seen show that S3 supports raw data uploads, but the gem doesn't seem to, because whenever I view the uploads I get broken images. Am I uploading it incorrectly? Is the data in the wrong format? I've tried plain Base64, atob-decoded Base64, and blob URLs, but nothing has worked so far.
JS:
fr.onload = function(event) {
  var Tresult = event.target.result;
  var datatype = Tresult.slice(Tresult.search(/\:/) + 1, Tresult.search(/\;/));
  var blob = atob(Tresult.replace(/^data\:image\/\w+\;base64\,/, ''));
  $.ajax({
    type: "POST",
    data: {
      file: blob,
      contentType: datatype,
      extension: datatype.slice(datatype.search(/\//) + 1)
    },
    url: '../uploads/images',
    success: function(msg) {
      handleStatus(msg, "success");
    },
    error: function(errormsg) {
      handleStatus(errormsg, "error");
    }
  });
}
Controller method:
def supload
  s3 = AWS::S3.new(:access_key_id => ENV['S3_KEY'], :secret_access_key => ENV['S3_SECRET'])
  bucket = s3.buckets['bucket-name']
  data = params[:file].to_s
  type = params[:contentType].to_s
  extension = params[:extension].to_s
  name = ('a'..'z').to_a.shuffle[0..7].join + ".#{extension}"
  obj = bucket.objects.create(name, data, { content_type: type, acl: "public_read" })
  url = obj.public_url().to_s
  render text: url
end
Edit:
To be clear, I've tried a couple of different formats; the one shown above is decoded Base64. Regular Base64 looks like this:
var Tresult = event.target.result;
var datatype = Tresult.slice(Tresult.search(/\:/) + 1, Tresult.search(/\;/));
var blob = Tresult;
$.ajax({
  type: "POST",
  data: {
    file: blob,
    mimeType: datatype,
    extension: datatype.slice(datatype.search(/\//) + 1)
  },
  url: '../uploads/images',
  success: function(msg) {
    handleStatus(msg, "success");
  },
  error: function(errormsg) {
    handleStatus(errormsg, "error");
  }
});
and a blob URL looks like this:
var blob = URL.createObjectURL(dataURItoBlob(Tresult, datatype));
...
function dataURItoBlob(dataURI, dataType) {
  var binary = atob(dataURI.split(',')[1]);
  var array = [];
  for (var i = 0; i < binary.length; i++) {
    array.push(binary.charCodeAt(i));
  }
  return new Blob([new Uint8Array(array)], {type: dataType});
}
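Worth noting: a blob: URL is only resolvable inside the browser that created it, so the server would receive just the URL string rather than the image bytes. If the Blob itself is what should be sent, the usual route is a multipart upload via FormData; here is a minimal sketch reusing the names from the code above (the field names and the ../uploads/images endpoint are just the same placeholders as before):

var blob = dataURItoBlob(Tresult, datatype);
var form = new FormData();
form.append('file', blob, 'upload.' + datatype.slice(datatype.search(/\//) + 1));
form.append('contentType', datatype);
$.ajax({
  type: 'POST',
  url: '../uploads/images',
  data: form,
  processData: false,  // don't let jQuery serialize the FormData
  contentType: false,  // let the browser set the multipart boundary
  success: function(msg) { handleStatus(msg, "success"); },
  error: function(errormsg) { handleStatus(errormsg, "error"); }
});

With this variant the Rails controller would receive an uploaded file object in params[:file] rather than a string, so it would need to read the file contents instead of decoding Base64.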
The browser takes the image/png part of a data URL and knows that what follows will be the bytes of a PNG image. It then sees base64 and knows that the blob after the comma has to be Base64-decoded before it can be handed to its PNG decoder; in other words, it converts the Base64 string back into raw bytes.
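As a rough sketch of that same decoding done by hand in JavaScript, starting from the FileReader result used above:

var dataUrl = event.target.result;                     // e.g. "data:image/png;base64,iVBOR..."
var header = dataUrl.slice(0, dataUrl.indexOf(','));   // "data:image/png;base64"
var payload = dataUrl.slice(dataUrl.indexOf(',') + 1); // the Base64 text itself
var mimeType = header.slice(5, header.indexOf(';'));   // "image/png"
var binary = atob(payload);                            // Base64 text -> binary string
var bytes = new Uint8Array(binary.length);
for (var i = 0; i < binary.length; i++) {
  bytes[i] = binary.charCodeAt(i);                     // the raw image bytes a PNG decoder would see
}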
Am I reading this right that you are:
1. converting the dropped image to a Base64 string in the browser, and
2. posting that string to your controller, which then uploads it to S3 with the gem?
If that's the case, you need to decode the data in step 2 before sending it on to S3. Something like this might work:
require "base64"
def supload
s3 = AWS::S3.new(:access_key_id => ENV['S3_KEY'],:secret_access_key => ENV['S3_SECRET'])
bucket = s3.buckets['bucket-name']
data = Base64.decode64(params[:file].to_s)
type = params[:contentType].to_s
extension = params[:extension].to_s
name = ('a'..'z').to_a.shuffle[0..7].join + ".#{extension}"
obj = bucket.objects.create(name,data,{content_type:type,acl:"public_read"})
url = obj.public_url().to_s
render text: url
end
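One caveat, assuming the "regular Base64" variant above is what ends up being posted: event.target.result is a full data URL, so the data:image/...;base64, prefix would also reach Base64.decode64 and skew the decoded bytes. Stripping the prefix on the client keeps the controller above unchanged; a minimal sketch of that variant, reusing the same handler code:

var Tresult = event.target.result;
var datatype = Tresult.slice(Tresult.search(/\:/) + 1, Tresult.search(/\;/));
// send only the Base64 payload after the comma, not the whole data URL,
// so Base64.decode64 on the server sees clean input
var base64Payload = Tresult.split(',')[1];
$.ajax({
  type: "POST",
  data: {
    file: base64Payload,
    contentType: datatype,
    extension: datatype.slice(datatype.search(/\//) + 1)
  },
  url: '../uploads/images',
  success: function(msg) { handleStatus(msg, "success"); },
  error: function(errormsg) { handleStatus(errormsg, "error"); }
});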