
PHP readfile on a file which is increasing in size

Is it possible to use the PHP readfile function on a remote file whose size is unknown and still increasing? Here is the scenario:

I'm developing a script which downloads a video from a third-party website and simultaneously transcodes the video into MP3 format. This MP3 is then transferred to the user via readfile.

The query used for the above process is like this:

wget -q -O- "VideoURLHere" | ffmpeg -i - "Output.mp3" > /dev/null 2>&1 &

So the file is fetched and encoded at the same time. While the above process is in progress I begin sending the output MP3 to the user via readfile. The problem is that the encoding takes some time, so depending on the user's download speed, readfile reaches an assumed EOF before the whole file has been encoded, and the user receives partial content/incomplete files.
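For context, the pipeline above would be started from PHP along these lines (a sketch, not the original script; `$videoUrl` and `$outputPath` are hypothetical placeholders, and the launch call is left commented out):

```php
<?php
// Sketch only: build the fetch+transcode shell command.
// $videoUrl and $outputPath are hypothetical placeholders.
$videoUrl   = 'http://example.com/video';
$outputPath = 'Output.mp3';
$cmd = 'wget -q -O- ' . escapeshellarg($videoUrl)
     . ' | ffmpeg -i - ' . escapeshellarg($outputPath)
     . ' > /dev/null 2>&1 &';
// shell_exec($cmd); // the trailing '&' detaches the pipeline so PHP continues
echo $cmd, "\n";
```

The trailing `&` is what lets the PHP script continue to the readfile step while wget and ffmpeg are still running.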

My first attempt to fix this was to apply a speed limit to the user's download, but this is not foolproof, as encoding time and speed vary with load, and it still led to partial downloads.

So is there a way to implement this system in such a way that I can serve the downloads simultaneously along with the encoding and also guarantee sending the complete file to the end user?

Any help is appreciated.

EDIT: In response to Peter, I'm actually using fread (see readfile_chunked below):

    <?php
    function readfile_chunked($filename, $retbytes = true) {
        $chunksize = 1 * (1024 * 1024); // bytes per chunk
        $cnt = 0;                       // total bytes delivered
        $handle = fopen($filename, 'rb');
        if ($handle === false) {
            return false;
        }
        while (!feof($handle)) {
            //usleep(120000); // used to impose an artificial speed limit
            $buffer = fread($handle, $chunksize);
            echo $buffer;
            ob_flush();
            flush();
            if ($retbytes) {
                $cnt += strlen($buffer);
            }
        }
        $status = fclose($handle);
        if ($retbytes && $status) {
            return $cnt; // return number of bytes delivered, like readfile() does
        }
        return $status;
    }
    readfile_chunked($linkToMp3);
    ?>

This still does not guarantee complete downloads: depending on the user's download speed and the encoding speed, EOF may be reached prematurely.
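One way to avoid that premature EOF (a sketch, not from the original script) is to keep retrying the read whenever no data arrives, and stop only when the writer is known to have finished. Here `$isWriterDone` is a hypothetical callback — it could, for example, check whether the background ffmpeg process is still alive:

```php
<?php
// Sketch: stream a file that may still be growing. Reading continues
// past the current EOF until $isWriterDone() reports that the encoder
// has finished. $isWriterDone is a hypothetical callback (e.g. a PID check).
function readfile_growing($filename, callable $isWriterDone) {
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    $sent = 0;
    while (true) {
        $buffer = fread($handle, 8192);
        if ($buffer !== false && $buffer !== '') {
            echo $buffer;
            flush();
            $sent += strlen($buffer);
            continue;
        }
        // No new data: stop only if the encoder has finished writing.
        if ($isWriterDone()) {
            break;
        }
        clearstatcache(); // PHP caches stat results; reset before retrying
        usleep(100000);   // give the encoder time to write more
    }
    fclose($handle);
    return $sent; // total bytes delivered
}
```

Because the loop exits only when the callback says the encoder is done and the file has been fully drained, the client always receives the complete MP3, however slow or fast the encode is.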

Also, in response to theJeztah's comment: I'm trying to achieve this without making the user wait, so that's not an option.

asked Jan 27 '13 by Sathiya Sundaram

2 Answers

Since you are dealing with streams, you probably should use stream handling functions :). passthru comes to mind, although this will only work if the download | transcode command is started in your script.

If it is started externally, take a look at stream_get_contents.
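A minimal sketch of the "started in your script" case (the command string is illustrative, and the helper name is made up): open the pipeline with popen and relay its stdout to the client as it is produced. Reading ends only when ffmpeg closes its stdout, i.e. when encoding is complete.

```php
<?php
// Sketch: run a command and relay its stdout to the client chunk by
// chunk. EOF on the pipe occurs only when the command finishes, so
// the client always gets the complete output.
function relay_command($cmd) {
    $pipe = popen($cmd, 'r');
    if ($pipe === false) {
        return false;
    }
    $bytes = 0;
    while (!feof($pipe)) {
        $chunk = fread($pipe, 8192);
        echo $chunk;
        flush();
        $bytes += strlen($chunk);
    }
    pclose($pipe);
    return $bytes; // total bytes relayed
}
// relay_command('wget -q -O- "VideoURLHere" | ffmpeg -i - -f mp3 -');
```

stream_get_contents($pipe) could replace the fread loop, but it buffers everything before returning, whereas the loop above streams as data arrives.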

answered Nov 04 '22 by Peter


Libevent as mentioned by Evert seems like the general solution where you have to use a file as a buffer. However in your case, you could do it all inline in your script without using a file as a buffer:

<?php
header("Content-Type: audio/mpeg");
passthru("wget -q -O- http://localhost/test.avi | ffmpeg -i - -f mp3 -");
?>
answered Nov 04 '22 by spinkus