fwrite() more than 2 GiB? [duplicate]

I have a set of files that I want to concatenate (each represents a part of a multi-part download).

Each split part is about 250 MiB in size, and I have a variable number of them.

My concatenation logic is straightforward:

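// 'xb' opens the output for binary writing and fails if the file already exists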
if (is_resource($handle = fopen($output, 'xb')) === true)
{
    foreach ($parts as $part)
    {
        if (is_resource($part = fopen($part, 'rb')) === true)
        {
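            // copy this part into the output in 4 KiB chunks until EOF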
            while (feof($part) !== true)
            {
                fwrite($handle, fread($part, 4096));
            }

            fclose($part);
        }
    }

    fclose($handle);
}

It took me a while to track it down but, apparently, whenever I have more than 8 individual parts (totaling more than 2 GiB), my output file gets truncated to 2147483647 bytes (as reported by sprintf('%u', filesize($output))).

I suppose this is due to some kind of 32-bit internal counter used by fopen() or fwrite().
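A quick check that seems to support this: on a 32-bit PHP build, PHP_INT_MAX is exactly the value my file is truncated to.

var_dump(PHP_INT_MAX); // int(2147483647) on a 32-bit build, int(9223372036854775807) on 64-bit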

How can I work around this problem (preferably using only PHP)?

asked Nov 02 '22 by Alix Axel


1 Answer

As a workaround, you could let the shell do the concatenation. If the code must be portable, you only need two variants: one for Windows and one for Linux (which also covers macOS).

Linux

cat file1.txt file2.txt  > file.txt

Windows

copy /b file1.txt+file2.txt file.txt

The /b switch forces binary mode; the default ASCII mode can truncate binary data at the first end-of-file character.

Note that when building the command line, escaping the variable arguments is essential: wrap every filename in escapeshellarg() (see http://de1.php.net/escapeshellarg).

To detect whether you are running on Windows or Linux, have a look at the PHP_OS constant (best explained here: http://www.php.net/manual/en/function.php-uname.php).
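Putting both together, here is a rough sketch (it reuses $parts and $output from the question; the exec() status handling is just illustrative):

// Sketch: concatenate $parts into $output via the shell.
// Assumes $parts holds the part filenames in the correct order.
$escaped = array_map('escapeshellarg', $parts);

if (stripos(PHP_OS, 'WIN') === 0)
{
    // Windows ('WINNT', 'WIN32', ...): copy /b joins the files in binary mode
    $command = 'copy /b ' . implode('+', $escaped) . ' ' . escapeshellarg($output);
}
else
{
    // Linux/macOS: cat streams all parts into the output file
    $command = 'cat ' . implode(' ', $escaped) . ' > ' . escapeshellarg($output);
}

exec($command, $ignored, $status);

if ($status !== 0)
{
    // a non-zero exit status means the shell command failed
    trigger_error('Concatenation failed with status ' . $status, E_USER_WARNING);
}

Note that escapeshellarg() adapts its quoting to the platform it runs on, so the same call works in both branches.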

answered Nov 13 '22 by Sven