
Allow gzip on existing files

I have static assets stored in GCS and I'd like to serve them gzipped (but they were uploaded without compression). Is there any way to set files to be compressed without downloading and re-uploading them in gzipped format?

I tried setting the content-encoding header with gsutil (i.e., gsutil setmeta -h 'Content-Encoding:gzip' <some_object_uri>), but it just led to a "Service Unavailable" error on the file (which I assume comes from the server attempting to un-gzip the file and failing).

asked Oct 25 '14 by Jeff Tratner

1 Answer

There is no way to compress the objects without downloading them and re-uploading.

However, you can have gsutil do this for you, and if you run it from a Google Compute Engine (GCE) Virtual Machine (VM), you'll only be charged for operation counts, not for bandwidth.

Also, regarding setting the content-encoding header with setmeta, you're right in your interpretation of what happened. You set the metadata on the object to indicate that it contained gzip data, but the contents did not contain a valid gzip stream, so when you try to download it with Accept-Encoding: gzip, the GCS service tries to decompress the stream and fails.
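If that object still has the bad header on it, you should be able to clear it before re-uploading. As a rough sketch (assuming gsutil setmeta treats an empty value as "remove this header", and reusing <some_object_uri> from your example):

gsutil setmeta -h 'Content-Encoding:' <some_object_uri>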

I'd suggest downloading the bucket to the local disk on a GCE VM:

gsutil cp -r gs://bucket /path/to/local/disk

Then, use the -z option to indicate which file extensions to gzip:

gsutil cp -z js,css,html -r /path/to/local/disk gs://bucket
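Afterwards you can sanity-check one of the re-uploaded objects, for example (the path here is just an illustration):

gsutil stat gs://bucket/path/to/file.js

The output should include a Content-Encoding: gzip line, and those files will now be served compressed to clients that send Accept-Encoding: gzip.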
answered Oct 17 '22 by jterrace