Firstly, I'm using Django. Django provides gzip middleware, which works just fine. Nginx also provides a gzip module. Would it make more sense to just use Nginx's gzip module because it is implemented purely in C, or are there other performance considerations I'm missing?
Secondly, Django doesn't gzip anything under 200 bytes. Is this because gzipping is too expensive to have any value when compressing output smaller than that?
Thirdly, the API I'm building will be almost purely dynamic with very little caching. Is gzipping expensive enough to make it impractical to use in this situation (versus a situation where I could cache the gzipped output on the webserver)?
1) I imagine that compressing once is enough (there is no point in gzipping in both Django and nginx), and nginx is probably faster, although I haven't benchmarked it yet. GzipMiddleware utilizes a few built-ins, which might be well optimized, too.
# From http://www.xhaus.com/alan/python/httpcomp.html#gzip
# Used with permission.
def compress_string(s):
    import cStringIO, gzip
    zbuf = cStringIO.StringIO()
    zfile = gzip.GzipFile(mode='wb', compresslevel=6, fileobj=zbuf)
    zfile.write(s)
    zfile.close()
    return zbuf.getvalue()
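Note that cStringIO exists only on Python 2; on Python 3 the same operation collapses to a standard-library one-liner. The sketch below is an equivalent, not Django's current source. If you offload compression to nginx instead, the relevant directives are gzip, gzip_comp_level and gzip_min_length, and you would leave GZipMiddleware out of Django's middleware so the response isn't compressed twice.

import gzip

def compress_string(s):
    # gzip.compress (Python 3.2+) wraps the same zlib machinery;
    # level 6 matches the snippet above.
    return gzip.compress(s, compresslevel=6)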
2) Small files simply can't benefit from compression: the gzip container alone adds a 10-byte header and an 8-byte trailer, so very small payloads can actually come out larger after compression. Skipping them saves CPU time for no loss.
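You can see the overhead directly with the standard library (illustrative, nothing assumed beyond stdlib):

import gzip

small = b'{"ok": true}'
compressed = gzip.compress(small)
# The 12-byte body comes back at roughly 30 bytes: the fixed header
# and trailer outweigh any deflate savings on input this small.
print(len(small), len(compressed))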
3) You could build a small test suite around representative sample data, then decide from those measurements what works best for your application; a rough sketch of such a benchmark follows.
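For instance, a micro-benchmark along these lines compares CPU cost and compression ratio per level. The payload here is a made-up stand-in; substitute a captured response from your own API.

import gzip
import json
import timeit

# Hypothetical sample data; replace with a representative API response.
payload = json.dumps(
    {"items": [{"id": i, "name": "item %d" % i} for i in range(100)]}
).encode()

for level in (1, 6, 9):
    seconds = timeit.timeit(
        lambda: gzip.compress(payload, compresslevel=level), number=1000
    )
    ratio = len(gzip.compress(payload, compresslevel=level)) / float(len(payload))
    print("level %d: %.3f s per 1000 calls, compressed to %.0f%% of original"
          % (level, seconds, ratio * 100))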