The backups are about 250 MB. That doesn't seem very big, but the problem appears to get worse as the size grows.
Log from the Backup gem below.
Note the time span: about 37 minutes into the upload I get a connection reset.
[2015/10/30 09:20:40][message] Storage::S3 started transferring '2015.10.30.09.20.01.myapp_postgres.tar' to bucket 'myapp-backups'.
[2015/10/30 09:57:06][error] ModelError: Backup for Back up PostgreSQL (myapp_postgres) Failed!
[2015/10/30 09:57:06][error] An Error occured which has caused this Backup to abort before completion.
[2015/10/30 09:57:06][error] Reason: Excon::Errors::SocketError
[2015/10/30 09:57:06][error] Connection reset by peer
Did you try the error handling options, which retransmit the portions of the file that failed?
store_with S3 do |s3|
  s3.max_retries = 10
  s3.retry_waitsec = 30
end
Also keep the chunk size small:
store_with S3 do |s3|
  s3.chunk_size = 5 # MiB
end
You may also want to use the Splitter options.
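For example, a minimal Splitter sketch, assuming the Backup gem's `split_into_chunks_of` setting inside the model definition (the model name and chunk size here are illustrative, not taken from your config):

```ruby
# Hypothetical Backup model; split_into_chunks_of is the Backup gem's
# Splitter option, which breaks the archive into fixed-size pieces so a
# failed transfer only has to retry one chunk instead of the whole file.
Model.new(:myapp_postgres, 'Back up PostgreSQL') do
  split_into_chunks_of 100 # MB per chunk; tune to your connection

  store_with S3 do |s3|
    s3.max_retries = 10
    s3.retry_waitsec = 30
  end
end
```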
As a temporary patch, I would use ruby-xz to compress the backup into a smaller file so there is less to send. Then try whether
Excon.defaults[:write_timeout] = 500
or more would do the trick.
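A sketch of that timeout override, set before the transfer starts (500 is a guess, not a documented threshold; raise it if resets persist):

```ruby
# e.g. near the top of the Backup model file, before any S3 transfer runs.
# Excon is the HTTP client the Backup gem's S3 storage uses under the hood.
require 'excon'

Excon.defaults[:write_timeout] = 500 # seconds allowed per socket write
```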