 

How to defeat gzip (or other lossless compression)

By the pigeonhole principle, every lossless compression algorithm can be "defeated", i.e. for some inputs it produces output that is longer than the input. Is it possible to explicitly construct a file which, when fed to e.g. gzip or another lossless compression program, will lead to (much) larger output? (Or, better still, a file which inflates ad infinitum upon subsequent compressions?)
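As a quick illustration of the premise, input that is already incompressible, such as random bytes, makes gzip produce output slightly larger than the input. A minimal sketch, assuming a system that provides /dev/urandom:

# Random bytes don't compress; gzip falls back to stored blocks,
# so its header and block framing make the output a bit larger.
head -c 1000000 /dev/urandom > random.bin
gzip -c random.bin > random.bin.gz
wc -c random.bin random.bin.gz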

asked Aug 06 '10 by Marcin Kotowski

2 Answers

Well, I'd assume it'll eventually max out, since the bit patterns will repeat, but I just did:

touch file
gzip -c file > file.1
...
gzip -c file.9 > file.10

And got:

  0 bytes: file
 25 bytes: file.1
 45 bytes: file.2
 73 bytes: file.3
103 bytes: file.4
122 bytes: file.5
152 bytes: file.6
175 bytes: file.7
205 bytes: file.8
232 bytes: file.9
262 bytes: file.10

Here are the sizes of 24,380 files plotted (this is really surprising to me, actually):

[Graph of output size vs. iteration: http://research.engineering.wustl.edu/~schultzm/images/filesize.png]

I was not expecting that kind of growth; I would have expected linear growth, since each pass should just wrap the existing data in a header plus a dictionary of patterns. I intended to run through 1,000,000 files, but my system ran out of disk space well before that.

If you want to reproduce this, here is the bash script that generates the files:

#!/bin/bash

# Start from an empty file and repeatedly compress the previous output.
touch file.0

for ((i = 0; i < 20000; i++)); do
    gzip -c "file.$i" > "file.$((i+1))"
done

# Emit "<iteration>\t<size in bytes>", sorted numerically by iteration.
wc -c file.* | awk '{print $2 "\t" $1}' | sed 's/file\.//' | sort -n > filesizes.txt

The resulting filesizes.txt is a tab-delimited, sorted file ready for your favorite graphing utility. (You'll have to remove the final "total" line manually, or script it away.)
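To check how fast the sizes actually grow, you can also compute the per-iteration deltas from filesizes.txt; a small sketch, assuming the "total" line has been removed and that column 1 is the iteration and column 2 the size:

# Print "<iteration>\t<growth in bytes since the previous iteration>".
awk 'NR > 1 { print $1 "\t" $2 - prev } { prev = $2 }' filesizes.txt > deltas.txt

If the growth were linear, the deltas would be roughly constant.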

answered Dec 31 '22 by mjschultz


Random data, or data encrypted with a good cipher, would probably be best.

But any good packer should only add constant overhead once it decides that it can't compress the data (@Frank). Given a fixed overhead, an empty file or a single character will give the greatest percentage overhead.
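A small sketch to check the constant-overhead claim, assuming /dev/urandom is available; the overhead should stay tiny relative to the input as it grows:

# Compress incompressible input of increasing size and report the overhead
# (gzip's header and trailer plus a few bytes of framing per deflate block).
for size in 1 100 10000 1000000; do
    head -c "$size" /dev/urandom > rand.bin
    out=$(gzip -c rand.bin | wc -c)
    echo "$size bytes in -> $out bytes out ($((out - size)) bytes overhead)"
done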

For packers that include the filename (e.g. rar, zip, tar), you could of course just make the filename really long :-)
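A sketch of the filename trick, assuming Info-ZIP's zip is installed (the 200-character name is just an arbitrary illustration):

# zip stores the name twice (local header + central directory),
# so the archive grows with the name length even for an empty file.
name=$(printf 'x%.0s' {1..200})
touch "$name"
zip -q long.zip "$name"
wc -c long.zip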

answered Dec 31 '22 by Douglas Leeder