 

Hadoop gzip compressed files

I am new to Hadoop and am trying to process a Wikipedia dump. It's a 6.7 GB gzip-compressed XML file. I read that Hadoop supports gzip-compressed files, but that such a file can only be processed by a single mapper, since only one mapper can decompress it. This seems to limit parallelism. Is there an alternative? For example, decompressing the XML file, splitting it into multiple chunks, and recompressing each chunk with gzip.

I read about Hadoop and gzip at http://researchcomputing.blogspot.com/2008/04/hadoop-and-compressed-files.html

Thanks for your help.

Asked Apr 12 '11 by Boolean

People also ask

What is a gzip compressed file?

gzip is a file format and a software application used for file compression and decompression. The program was created by Jean-loup Gailly and Mark Adler as a free software replacement for the compress program used in early Unix systems, and was intended for use by the GNU project (from which the "g" of gzip is derived).

How do I enable gzip compression?

To enable gzip on Windows servers (IIS Manager):

  1. Open up IIS Manager.
  2. Click on the site you want to enable compression for.
  3. Click on Compression (under IIS).
  4. Enable static compression, and you are done.

Is gzip compressed?

GZIP is a compression technology frequently used for transferring data quickly over the internet. "GZIP" refers to a compression method, to software used to compress files with this method, and to the file format that results from GZIP compression (usually indicated by the file extension .gz).

How do I compress a folder in Hadoop?

Using Pig:

    SET output.compression.enabled true;
    SET output.compression.codec org.apache.hadoop.io.compress.BZip2Codec;
    -- comma-separated list of HDFS directories to compress
    input0 = LOAD '$IN_DIR' USING PigStorage();
    -- single output directory
    STORE input0 INTO '$OUT_DIR' USING PigStorage();


2 Answers

A file compressed with the gzip codec cannot be split because of the way this codec works: a gzip file is a single DEFLATE stream that can only be decompressed from the beginning, so there is no way to start reading in the middle. A single split in Hadoop can only be processed by a single mapper, so a single gzip file can only be processed by a single mapper.
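For reference, this is exactly what Hadoop's input formats check. The sketch below paraphrases the splittability test in TextInputFormat (not a verbatim copy): a file is splittable if it is uncompressed, or if its codec implements SplittableCompressionCodec, which bzip2 does and gzip does not.

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.compress.CompressionCodec;
    import org.apache.hadoop.io.compress.CompressionCodecFactory;
    import org.apache.hadoop.io.compress.SplittableCompressionCodec;
    import org.apache.hadoop.mapreduce.JobContext;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;

    // Paraphrased splittability check, in the style of Hadoop's TextInputFormat.
    public class SplitCheckInputFormat extends TextInputFormat {
        @Override
        protected boolean isSplitable(JobContext context, Path file) {
            CompressionCodec codec =
                new CompressionCodecFactory(context.getConfiguration()).getCodec(file);
            if (codec == null) {
                return true; // uncompressed text: Hadoop may split it anywhere
            }
            // GzipCodec does not implement SplittableCompressionCodec, so a
            // .gz file yields exactly one split; BZip2Codec does implement it.
            return codec instanceof SplittableCompressionCodec;
        }
    }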

There are at least three ways around that limitation:

  1. As a preprocessing step: uncompress the file and recompress it using a splittable codec (e.g. LZO).
  2. As a preprocessing step: uncompress the file, split it into smaller chunks, and recompress them (a sketch of this follows the list).
  3. Use this patch for Hadoop (which I wrote) that allows a way around this: Splittable Gzip.
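A minimal sketch of option 2 in plain Java follows; the input file name, chunk size, and output naming are assumptions for illustration. It rolls over to a new chunk only after a closing </page> line, so no wiki page straddles two files (each chunk then holds whole <page> records, though not a complete XML document).

    import java.io.*;
    import java.util.zip.GZIPInputStream;
    import java.util.zip.GZIPOutputStream;

    // Stream-decompress one big .gz and rewrite it as many smaller .gz chunks.
    public class GzipChunker {
        public static void main(String[] args) throws IOException {
            // ~256 MB of uncompressed text per chunk (approximate: chars, not bytes)
            long targetChars = 256L * 1024 * 1024;
            try (BufferedReader in = new BufferedReader(new InputStreamReader(
                    new GZIPInputStream(new FileInputStream("enwiki.xml.gz")), "UTF-8"))) {
                int part = 0;
                long written = 0;
                Writer out = openChunk(part);
                for (String line; (line = in.readLine()) != null; ) {
                    out.write(line);
                    out.write('\n');
                    written += line.length() + 1;
                    // Roll over only after a closing </page> tag so no wiki
                    // page is split across two chunk files.
                    if (written >= targetChars && line.trim().equals("</page>")) {
                        out.close();
                        out = openChunk(++part);
                        written = 0;
                    }
                }
                out.close();
            }
        }

        private static Writer openChunk(int part) throws IOException {
            String name = String.format("chunk-%05d.xml.gz", part);
            return new OutputStreamWriter(
                    new GZIPOutputStream(new FileOutputStream(name)), "UTF-8");
        }
    }

Each resulting chunk is its own gzip file, so Hadoop can assign one mapper per chunk.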

HTH

Answered by Niels Basjes


This is one of the biggest misunderstandings about HDFS.

Yes, files compressed with gzip are not splittable by MapReduce, but that does not mean that gzip as a codec has no value in HDFS or that it cannot be used in a splittable way.

Gzip as a codec can be used with RCFiles, SequenceFiles, Avro files, and many other file formats. When the gzip codec is used within these splittable container formats, you get gzip's great compression and pretty good speed, plus the splittable component.
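As an illustration, here is a minimal sketch of writing a block-compressed SequenceFile with the gzip codec; the output path and the single record are made up for the example. Because compression is applied per block rather than to the file as a whole, MapReduce can place splits at block boundaries, which is what restores splittability.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.SequenceFile;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.io.compress.GzipCodec;
    import org.apache.hadoop.util.ReflectionUtils;

    // Write a SequenceFile whose blocks are gzip-compressed.
    public class GzipSequenceFileDemo {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            GzipCodec codec = ReflectionUtils.newInstance(GzipCodec.class, conf);
            try (SequenceFile.Writer writer = SequenceFile.createWriter(conf,
                    SequenceFile.Writer.file(new Path("pages.seq")),
                    SequenceFile.Writer.keyClass(LongWritable.class),
                    SequenceFile.Writer.valueClass(Text.class),
                    SequenceFile.Writer.compression(
                            SequenceFile.CompressionType.BLOCK, codec))) {
                // BLOCK compression batches many records and gzips each batch;
                // split boundaries fall between compressed blocks.
                writer.append(new LongWritable(1L), new Text("<page>...</page>"));
            }
        }
    }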

Answered by Ted Malaska