I'm using a simple unzip function (as seen below) so I don't have to decompress files manually before they are processed further.
function uncompress($srcName, $dstName) {
    // gzfile() decompresses the whole file into an array of lines at once
    $string = implode("", gzfile($srcName));
    $fp = fopen($dstName, "w");
    fwrite($fp, $string, strlen($string));
    fclose($fp);
}
The problem is that if the gzip file is large (e.g. 50 MB), unzipping it takes a large amount of RAM.
The question: can I parse a gzipped file in chunks and still get the correct result? Or is there a better way to handle extracting large gzip files (even if it takes a few seconds longer)?
gzfile() is a convenience function that calls gzopen(), gzread(), and gzclose() for you.
So, yes, you can call gzopen() yourself and gzread() the file in chunks.
This will uncompress the file in 4 kB chunks:
function uncompress($srcName, $dstName) {
    $sfp = gzopen($srcName, "rb");   // open the gzip file for binary reading
    $fp = fopen($dstName, "wb");     // open the destination for binary writing
    // decompress 4 kB at a time so only one chunk is held in memory
    while (!gzeof($sfp)) {
        $string = gzread($sfp, 4096);
        fwrite($fp, $string, strlen($string));
    }
    gzclose($sfp);
    fclose($fp);
}
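If you would rather let PHP handle the buffering for you, roughly the same result can be had with the compress.zlib:// stream wrapper and stream_copy_to_stream(). This is only a sketch of an equivalent approach, not part of the original answer; the function name and signature are kept the same purely for symmetry:

function uncompress($srcName, $dstName) {
    // compress.zlib:// decompresses the gzip stream transparently as it is read
    $sfp = fopen("compress.zlib://" . $srcName, "rb");
    $fp = fopen($dstName, "wb");
    // copies the stream internally in chunks, so memory use stays low
    stream_copy_to_stream($sfp, $fp);
    fclose($sfp);
    fclose($fp);
}

Either version is called the same way as before, e.g. uncompress("dump.sql.gz", "dump.sql") (file names here are just placeholders).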