
Python equivalent of piping file output to gzip in Perl using a pipe

I need to figure out how to write file output to a compressed file in Python, similar to the two-liner below:

open ZIPPED, "| gzip -c > zipped.gz";
print ZIPPED "Hello world\n";

In Perl, this uses Unix gzip to compress whatever you print to the ZIPPED filehandle to the file "zipped.gz".

I know I can do this in Python with the gzip module, like this:

import gzip
zipped = gzip.open("zipped.gz", 'wb')
zipped.write(b"Hello world\n")

However, that is extremely slow. According to the profiler, using that method takes up 90% of my run time since I am writing 200GB of uncompressed data to various output files. I am aware that the file system could be part of the problem here, but I want to rule it out by using Unix/Linux compression instead. This is partially because I have heard that decompressing using this same module is slow as well.

asked Nov 28 '11 by bu11d0zer

4 Answers

Make sure you use the same compression level when comparing speeds. By default, Linux gzip uses level 6, while Python's gzip module uses level 9. I tested this in Python 3.6.8 with gzip 1.5, compressing 600 MB of data from a MySQL dump. With default settings:

- Python module: 9.24 seconds, 47.1 MB file
- subprocess gzip: 8.61 seconds, 48.5 MB file

After changing the Python module to level 6 so they match:

- Python module: 8.09 seconds, 48.6 MB file
- subprocess gzip: 8.55 seconds, 48.5 MB file

import gzip
import subprocess
import time

# `dump` holds the uncompressed bytes; `outfile` and `outfile2` are output paths

# subprocess method
start = time.time()
with open(outfile, 'wb') as f:
    subprocess.run(['gzip'], input=dump, stdout=f, check=True)
print('subprocess finished after {:.2f} seconds'.format(time.time() - start))

# gzip module method
start = time.time()
with gzip.open(outfile2, 'wb', compresslevel=6) as z:
    z.write(dump)
print('gzip module finished after {:.2f} seconds'.format(time.time() - start))
answered Oct 14 '22 by Elliott B

Try something like this:

from subprocess import Popen, PIPE

f = open('zipped.gz', 'wb')  # binary mode for the compressed output
pipe = Popen(['gzip'], stdin=PIPE, stdout=f)
pipe.communicate(b'Hello world\n')  # bytes, not str, in Python 3
f.close()
answered Oct 14 '22 by Moishe Lettvin

ChristopheD's suggestion of using the subprocess module is an appropriate answer to this question. However, it's not clear to me that it will solve your performance problems. You would have to measure the performance of the new code to be sure.

To convert your sample code:

import subprocess

p = subprocess.Popen("gzip -c > zipped.gz", shell=True, stdin=subprocess.PIPE)
p.communicate(b"Hello world\n")  # bytes in Python 3

Since you need to send large amounts of data to the sub-process, writing incrementally through the stdin attribute of the Popen object avoids buffering everything in memory for a single communicate() call. For example:

import subprocess

p = subprocess.Popen("gzip -c > zipped.gz", shell=True, stdin=subprocess.PIPE)
p.stdin.write(b"Some data")

# Write more data here...

p.communicate()  # finish writing data and wait for the subprocess to exit

You may also find the discussion at this question helpful.

answered Oct 15 '22 by srgerg


Using the gzip module is the official one-way-to-do-it, and it's unlikely that any other pure-Python approach will be faster. This is especially true because the size of your data rules out in-memory options. Most likely, the fastest way is to write the full file to disk and use subprocess to call gzip on that file.
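A minimal sketch of that approach (the file name is a placeholder, and it assumes a Unix gzip binary is on the PATH):

import subprocess

# Write the uncompressed data to disk first...
with open('output.txt', 'wb') as f:
    f.write(b'Hello world\n')

# ...then let the external gzip compress it in place, producing
# output.txt.gz and removing output.txt (-f overwrites any old copy).
subprocess.run(['gzip', '-f', 'output.txt'], check=True)

This trades extra disk I/O for letting gzip run at full speed outside the interpreter, which can pay off when the bottleneck is the compression itself.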

answered Oct 14 '22 by Raymond Hettinger