How to download a big file using PHP (low memory usage)

Tags: file, php, download

I have to download a big file (1xx MB) using PHP.

How can I download it without wasting memory (RAM) on a temporary copy?

When I use

$something = file_get_contents('http://somehost.example/file.zip');
file_put_contents('myfile.zip', $something);

I need as much memory as the size of that file.

Maybe it's possible to download it some other way?

For example in parts (say 1024 bytes each), writing each part to disk and then downloading the next, repeating until the file is fully downloaded?
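
A minimal sketch of that idea, assuming PHP's HTTP stream wrappers (allow_url_fopen) are enabled and reusing the placeholder URL and filename above, could use stream_copy_to_stream(), which copies between two open handles in small internal chunks instead of loading the whole file:

$src = fopen('http://somehost.example/file.zip', 'rb');
$dst = fopen('myfile.zip', 'wb');
if ($src !== false && $dst !== false) {
    // copies the stream in small internal chunks, so memory use stays
    // low regardless of the size of the remote file
    stream_copy_to_stream($src, $dst);
    fclose($src);
    fclose($dst);
}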

asked Oct 22 '10 by marc

2 Answers

Copy the file one small chunk at a time

/**
 * Copy remote file over HTTP one small chunk at a time.
 *
 * @param $infile The full URL to the remote file
 * @param $outfile The path where to save the file
 */
function copyfile_chunked($infile, $outfile) {
    $chunksize = 10 * (1024 * 1024); // 10 Megs

    /**
     * parse_url breaks apart a URL into its parts, i.e. host, path,
     * query string, etc.
     */
    $parts = parse_url($infile);
    $i_handle = fsockopen($parts['host'], 80, $errno, $errstr, 5);
    $o_handle = fopen($outfile, 'wb');

    if ($i_handle === false || $o_handle === false) {
        return false;
    }

    if (!empty($parts['query'])) {
        $parts['path'] .= '?' . $parts['query'];
    }

    /**
     * Send the request to the server for the file
     */
    $request = "GET {$parts['path']} HTTP/1.1\r\n";
    $request .= "Host: {$parts['host']}\r\n";
    $request .= "User-Agent: Mozilla/5.0\r\n";
    $request .= "Keep-Alive: 115\r\n";
    $request .= "Connection: keep-alive\r\n\r\n";
    fwrite($i_handle, $request);

    /**
     * Now read the headers from the remote server. We'll need
     * to get the content length.
     */
    $headers = array();
    while(!feof($i_handle)) {
        $line = fgets($i_handle);
        if ($line == "\r\n") break;
        $headers[] = $line;
    }

    /**
     * Look for the Content-Length header, and get the size
     * of the remote file.
     */
    $length = 0;
    foreach($headers as $header) {
        if (stripos($header, 'Content-Length:') === 0) {
            $length = (int)str_replace('Content-Length: ', '', $header);
            break;
        }
    }

    /**
     * Start reading in the remote file, and writing it to the
     * local file one chunk at a time.
     */
    $cnt = 0;
    while(!feof($i_handle)) {
        $buf = '';
        $buf = fread($i_handle, $chunksize);
        $bytes = fwrite($o_handle, $buf);
        if ($bytes === false) {
            return false;
        }
        $cnt += $bytes;

        /**
         * We're done reading when we've reached the content length
         */
        if ($cnt >= $length) break;
    }

    fclose($i_handle);
    fclose($o_handle);
    return $cnt;
}

Adjust the $chunksize variable to your needs. This has only been mildly tested. It could easily break for a number of reasons.

Usage:

copyfile_chunked('http://somesite.com/somefile.jpg', '/local/path/somefile.jpg');
answered Oct 15 '22 by mellowsoon

You can shell out to wget using exec(); this results in the lowest memory usage.

<?php
 // -O (capital O) writes the download to the given filename;
 // lowercase -o would only redirect wget's log output
 exec("wget -O outputfilename.tar.gz http://pathtofile/file.tar.gz");
?>

You can also try using fopen(), fread(), and fwrite(). That way you only download x bytes into memory at a time.
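
A minimal sketch of that approach, assuming allow_url_fopen is enabled (the URL and local filename are placeholders):

$in  = fopen('http://somehost.example/file.zip', 'rb');
$out = fopen('myfile.zip', 'wb');
if ($in !== false && $out !== false) {
    while (!feof($in)) {
        // read one small chunk (8 KB here) and write it straight to disk,
        // so only a single chunk is held in memory at any time
        fwrite($out, fread($in, 8192));
    }
    fclose($in);
    fclose($out);
}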

answered Oct 15 '22 by Byron Whitlock