I have a problem when creating a file with encoding 'utf-8' and reading it back from an Amazon S3 bucket.
I create the file:
file = File.open('new_file', 'w', :encoding => 'utf-8')
string = "Some ££££ sings"
file.write(string)
file.close
When I read it locally, everything is fine:
open('new_file').read
=> "Some ££££ sings"
Now I upload the file to Amazon S3 using the aws-s3 gem:
AWS::S3::S3Object.store('new_file', open('new_file'), 'my_bucket')
=> #<AWS::S3::S3Object::Response:0x2214462560 200 OK>
When I read it back from Amazon S3:
AWS::S3::S3Object.find('new_file', 'my_bucket').value
=> "Some \xC2\xA3\xC2\xA3\xC2\xA3\xC2\xA3 sings"
open(AWS::S3::S3Object.find('new_file','my_bucket').url).read
=> "Some \xC2\xA3\xC2\xA3\xC2\xA3\xC2\xA3 sings"
I've tried many things and still can't find a solution.
I found the solution on a different forum.
The way to do it is to make sure the text file is passed/uploaded as UTF-8 in the first place. That by itself won't change what S3 returns, but it means you can safely force the string encoding on what you read back:
open(AWS::S3::S3Object.find('new_file','my_bucket').url).read.force_encoding('utf-8')
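If you control the upload as well, you can also store the object with an explicit charset so HTTP clients know it is UTF-8, and wrap the read in a small helper. This is only a rough sketch, assuming the aws-s3 gem's store method accepts a :content_type option (the read_utf8 helper is just an illustrative name):

# Upload with an explicit charset in the Content-Type header
AWS::S3::S3Object.store(
  'new_file',
  open('new_file'),
  'my_bucket',
  :content_type => 'text/plain; charset=utf-8'
)

# Read the object back and tag the bytes as UTF-8
def read_utf8(key, bucket)
  AWS::S3::S3Object.find(key, bucket).value.force_encoding('utf-8')
end

read_utf8('new_file', 'my_bucket')
=> "Some ££££ sings"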