Can't read files from amazon s3 bucket using aws_s3 (ruby gem) in correct encoding?

I have a problem creating a file with UTF-8 encoding and then reading it back from an Amazon S3 bucket.

I create a file.

file = File.open('new_file', 'w', :encoding => 'utf-8')
string = "Some ££££ sings"
file.write(string)
file.close

When I read the file locally, everything is fine.

open('new_file').read
=> "Some ££££ sings"

Now I upload the file to Amazon S3 using aws_s3.

AWS::S3::S3Object.store('new_file', open('new_file'), 'my_bucket')
=> #<AWS::S3::S3Object::Response:0x2214462560 200 OK>

When I read it back from Amazon S3:

AWS::S3::S3Object.find('new_file', 'my_bucket').value
=> "Some \xC2\xA3\xC2\xA3\xC2\xA3\xC2\xA3 sings"

open(AWS::S3::S3Object.find('new_file','my_bucket').url).read
=> "Some \xC2\xA3\xC2\xA3\xC2\xA3\xC2\xA3 sings"
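Those escape sequences are actually the correct UTF-8 bytes for £ (`0xC2 0xA3`); the string just comes back tagged with the binary (ASCII-8BIT) encoding instead of UTF-8. A minimal sketch of that diagnosis, independent of S3 (the variable names are mine):

```ruby
# Simulate what the S3 client returns: the right bytes,
# but labelled as binary (ASCII-8BIT) rather than UTF-8.
raw = "Some \xC2\xA3\xC2\xA3\xC2\xA3\xC2\xA3 sings".b

raw.encoding           # Encoding::ASCII-8BIT -- hence the escaped output

# Relabel the bytes as UTF-8 (no byte conversion happens).
utf = raw.force_encoding('utf-8')
utf.encoding           # Encoding::UTF-8
```

So the data in the bucket isn't corrupted; the encoding label is lost in transit.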

I've been trying many things and still can't find a solution.

Many thanks for all the help.

M

asked Oct 09 '22 by twooface

1 Answer

I found a solution on a different forum.

The way to do it is to make sure you write/upload the text file as UTF-8 in the first place. That by itself won't fix the string you read back, but it means you can safely force the encoding on it:

open(AWS::S3::S3Object.find('new_file','my_bucket').url).read.force_encoding('utf-8')
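Note that `force_encoding` only relabels the bytes without converting anything, so if the file might not have been written as UTF-8 it's worth validating the result. A small sketch of that, assuming a hypothetical `to_utf8` helper (not part of aws_s3):

```ruby
# Relabel a binary string as UTF-8 and verify the bytes really are
# valid UTF-8, since force_encoding itself performs no conversion.
def to_utf8(raw)
  s = raw.dup.force_encoding(Encoding::UTF_8)
  raise ArgumentError, 'body is not valid UTF-8' unless s.valid_encoding?
  s
end

# Usage with the gem would look like:
#   to_utf8(AWS::S3::S3Object.find('new_file', 'my_bucket').value)
```

If the check ever fails, the file was stored in some other encoding and needs `String#encode` with the correct source encoding instead.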
answered Oct 12 '22 by twooface