I need to alter my routine so that the final output file is gzipped. I'm trying to figure out the best way to gzip a processed file that is created within a Perl subroutine.
For example, I have a subroutine that creates the file (extract_data). Here's the main loop and subroutine:
foreach my $tblist (@tblist) {
    chomp $tblist;
    extract_data($dbh, $tblist);
}
$dbh->disconnect;

sub extract_data {
    my ($dbh, $tblist) = @_;
    my $final_file = "/home/proc/$node-$tblist.dat";
    open(my $out_fh, '>', $final_file) or die "cannot create $final_file: $!";
    my $sth = $dbh->prepare("...");
    $sth->execute();
    while (my ($uid, $hostnm, $col1, $col2, $col3, $upd, $col5) = $sth->fetchrow_array()) {
        print $out_fh "__my_key__^A$uid^Ehost^A$hostnm^Ecol1^A$col1^Ecol2^A$col2^Ecol3^A$col3^Ecol4^A$upd^Ecol5^A$col5^D";
    }
    $sth->finish;
    close $out_fh or die "Failed to close file: $!";
}
Should I do the gzip within the main loop or within the sub? What is the best way to do so?
My new file would then be $final_file = /home/proc/$node-$tblist.dat.gz.
Thanks.
I know there are modules that do this without using external programs, but since I understand how to use gzip a lot better than I understand how to use those modules, I just open a process to gzip and call it a day.
open(my $gzip_fh, '|-', "/bin/gzip -c > $final_file.gz") or die "error starting gzip: $!";
...
while (... = $sth->fetchrow_array()) {
    print $gzip_fh "__my_key__^A$uid^Ehost^A$hostnm...";  # uncompressed data in, gzip compresses on the way out
}
...
close $gzip_fh or die "error closing gzip: $!";  # close waits for gzip to exit and reports failure
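To show the whole pipeline in one runnable piece, here's a minimal, self-contained sketch of the same idea. The output path and the row payload are made up for illustration (the real code builds the name from $node and $tblist), and it assumes a gzip binary on PATH:

```perl
use strict;
use warnings;

# Illustrative output path; the real code derives it from $node/$tblist.
my $final_file = "/tmp/demo-pipe-$$.dat";

# Three-argument open of a pipe: everything printed to $gzip_fh is
# compressed by gzip and written to "$final_file.gz".
open(my $gzip_fh, '|-', "gzip -c > $final_file.gz")
    or die "error starting gzip: $!";

# Stand-in for the fetchrow_array loop: print uncompressed records.
print $gzip_fh "__my_key__^Arow$_\n" for 1 .. 3;

# Closing a pipe waits for gzip to exit; a false return means it failed.
close $gzip_fh or die "error closing gzip: $!";
```

The key point is that the writing code is unchanged apart from the open and the checked close; gzip does the compression transparently.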
You can use IO::Compress::Gzip, which is in the set of core Perl modules:
use IO::Compress::Gzip qw(gzip $GzipError);

my $z = IO::Compress::Gzip->new($fileName)
    or die "gzip failed: $GzipError\n";

# object interface
$z->print($string);
$z->printf($format, $string);
$z->write($string);
$z->close();

# IO::File mode
print $z $string;
printf $z $format, $string;
close($z);
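Applied to the shape of the original extract_data, that looks like the sketch below. The file path and record contents are placeholders, not the asker's real values; the point is that the IO::Compress::Gzip handle replaces the plain filehandle with no external program:

```perl
use strict;
use warnings;
use IO::Compress::Gzip qw($GzipError);

# Illustrative path; the real code would use "/home/proc/$node-$tblist.dat.gz".
my $final_file = "/tmp/demo-iocg-$$.dat.gz";

my $z = IO::Compress::Gzip->new($final_file)
    or die "gzip failed: $GzipError\n";

# Same kind of print calls as before, but the handle compresses as it writes.
for my $row (1 .. 3) {
    $z->print("__my_key__^A$row^D");
}

# close flushes the compressed stream and writes the gzip trailer.
$z->close() or die "close failed: $GzipError\n";
```

Since the module is pure Perl API, this works the same on systems without a gzip binary.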
More details at perldoc IO::Compress::Gzip.
FWIW, there's also IO::Uncompress::Gunzip for reading from gzipped files in a similar fashion.
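For completeness, a small sketch of the reading side (the file name and contents here are invented for the demo). An IO::Uncompress::Gunzip object can be read line by line just like a plain filehandle:

```perl
use strict;
use warnings;
use IO::Compress::Gzip qw(gzip $GzipError);
use IO::Uncompress::Gunzip qw($GunzipError);

# Create a small gzipped file to read back (illustrative path).
my $file = "/tmp/demo-read-$$.dat.gz";
my $data = "line 1\nline 2\n";
gzip \$data => $file or die "gzip failed: $GzipError";

# The Gunzip object decompresses transparently as you read from it.
my $fh = IO::Uncompress::Gunzip->new($file)
    or die "gunzip failed: $GunzipError\n";

my @lines;
while (my $line = <$fh>) {
    push @lines, $line;
}
$fh->close();
```

This makes reading a .gz file symmetric with the writing examples above: swap the open for a constructor and the rest of the loop is unchanged.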