We have a staging server set up to use a different S3 bucket from our production server, but that requires us to manually sync the images between the buckets in order to see images on staging. As we have tens of thousands of images (and the number grows daily), this is not viable.
Is there any way to configure CarrierWave to read images from our production S3 bucket, but write any new images to the staging bucket (so as to not contaminate our production image store)?
UPDATE: I've tried my hand at creating a custom storage engine for CarrierWave that would do this (see this gist, which is basically identical to the fog storage engine except for line 228), but I'm getting this error when trying to fetch images:
Excon::Errors::SocketError (hostname does not match the server certificate (OpenSSL::SSL::SSLError)):
lib/carrier_wave/storage/dual_fog.rb:214:in `exists?'
lib/carrier_wave/storage/dual_fog.rb:228:in `public_url'
lib/carrier_wave/storage/dual_fog.rb:267:in `url'
Does anyone know why that is? As you can see from the code in the gist, I want this solution to read from staging and fall back to production if no image is found on staging. All write operations, however, should only go to staging.
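To make the intent concrete, here is a minimal sketch of the fallback behaviour I'm after (this is not the actual gist; `production_connection` and `production_fog_directory` are hypothetical uploader accessors for the read-only production bucket, and everything else follows CarrierWave's own fog storage classes):

```ruby
require 'carrierwave'
require 'fog'

module CarrierWave
  module Storage
    class DualFog < CarrierWave::Storage::Fog
      # The parent's store!/retrieve! build plain Fog::File objects, so they
      # are overridden to return the fallback-aware File defined below.
      def store!(file)
        f = File.new(uploader, self, uploader.store_path)
        f.store(file)
        f
      end

      def retrieve!(identifier)
        File.new(uploader, self, uploader.store_path(identifier))
      end

      class File < CarrierWave::Storage::Fog::File
        # A file "exists" if either bucket has it.
        def exists?
          !!(directory.files.head(path) || production_directory.files.head(path))
        end

        # Serve the staging copy if it exists; otherwise fall back to the
        # production bucket. Writes are untouched, so they still go to the
        # staging bucket configured as fog_directory.
        def public_url
          if directory.files.head(path)
            super
          elsif production_directory.files.head(path)
            # Path-style URL so SSL still works with dots in the bucket name.
            "https://s3.amazonaws.com/#{@uploader.production_fog_directory}/#{path}"
          end
        end

        private

        # Read-only handle on the production bucket (hypothetical accessors).
        def production_directory
          @production_directory ||= @uploader.production_connection
            .directories.new(:key => @uploader.production_fog_directory)
        end
      end
    end
  end
end
```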
There are several ways to address S3. If you have dots in your bucket name and want to use SSL, you'll need to force your URLs into the path-style form: https://s3.amazonaws.com/staging.asset.domain.com/blah. You cannot use https://yourdomain.com/blah, because Amazon doesn't have your certificate. You also cannot use https://staging.asset.domain.com.s3.amazonaws.com/blah, because Amazon's wildcard certificate only covers one level of subdomain (i.e. if your bucket name had no dots, it would work).
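If CarrierWave is talking to S3 through fog, one way to get those path-style URLs is fog's :path_style connection option. A sketch with placeholder credentials, assuming a fog version that supports the option:

```ruby
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY'],
    # Build URLs as https://s3.amazonaws.com/<bucket>/<key> instead of
    # <bucket>.s3.amazonaws.com, so the wildcard cert matches.
    :path_style            => true
  }
  config.fog_directory = 'staging.asset.domain.com' # bucket name from the question
end
```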