 

FFmpegPHP get thumbnail from external URL

I'm trying to create thumbnails from external videos, mostly MP4s and FLVs. I'm using FFmpegPHP. I already have the thumbnail generation working fine, however, I need to load the video entirely on my server first. Would it be possible to stream only a small part of the video then extract the thumbnail from that?

Here's the code I have so far:

require_once PRIV . 'Vendor/FFmpegPHP/FFmpegAutoloader.php';

// Download the whole video.
$video = file_get_contents($_POST['video']);
$file = 'path_to_cache';
file_put_contents($file, $video);

$movie = new FFmpegMovie($file);

// Generate the thumbnail.
$thumb = $movie->getFrame($movie->getFrameCount() / 2);
$thumb->resize(320, 240);
imagejpeg($thumb->toGDImage(), 'path_to_thumb');

Does anyone have a suggestion?

EDIT

As Brad suggested, here is the updated code:

$file = CACHE . 'moodboard_video_' . rand();
$fh = fopen($file, 'w');
$size = 0;

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $_POST['video']);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function($ch, $data) use($fh, &$size){
    $length = fwrite($fh, $data);

    if($length === FALSE) {
        return 0;
    } else {
        $size += $length;
    }

    // Stop once the byte budget (a multiple of 1MB) is reached.
    return $size < 1024 * 1024 * XXXXXX ? $length : 0;
});

curl_exec($ch);

fclose($fh);
curl_close($ch);

// Create the thumbnail from the partial file.
$movie = new FFmpegMovie($file);
$thumb = $movie->getFrame(XXXXXX);
$thumb->resize(static::DEFAULT_THUMBNAIL_WIDTH, $thumb->getHeight() / $thumb->getWidth() * static::DEFAULT_THUMBNAIL_WIDTH);
$image = $thumb->toGDImage();
imagejpeg($image, PRIV . static::THUMBNAILS_PATH . $item->getLastInsertIdentifier() . '_' . static::DEFAULT_THUMBNAIL_WIDTH);
jValdron asked Sep 27 '12

1 Answer

FFmpeg is very good at working with broken or truncated streams. Because of this, I think you should try downloading the first few megabytes of the remote media and extracting a thumbnail from the incomplete file.

First, drop file_get_contents() and use cURL. You can point the CURLOPT_WRITEFUNCTION option at a custom function of yours that writes to a temporary file on disk, chunk by chunk. When you've received enough data, return 0 from your function and cURL will stop downloading. You will have to experiment to find the optimum size: if you fetch too little data, you will only have the earliest frames to work with, or no frames at all; too much, and you are wasting bandwidth.
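To make that callback reusable, it can be factored into a small helper that stops the transfer once a byte budget is reached. This is only a sketch: the URL, cache path, and 1 MiB budget below are assumptions to adapt, not values from the question.

```php
<?php
// Returns a cURL write callback that writes at most roughly $limit bytes
// to $fh, then aborts the transfer. cURL treats any return value that
// differs from the chunk length (such as 0) as a write error and stops.
function makeLimitedWriter($fh, int $limit, int &$written): callable
{
    return function ($ch, string $data) use ($fh, $limit, &$written): int {
        $len = fwrite($fh, $data);
        if ($len === false) {
            return 0; // abort on write error
        }
        $written += $len;
        // Past the budget: return 0 so cURL stops downloading.
        return $written < $limit ? $len : 0;
    };
}

// Usage sketch (URL and path are hypothetical):
$file = sys_get_temp_dir() . '/video_head.mp4';
$fh = fopen($file, 'w');
$written = 0;

$ch = curl_init('http://example.com/video.mp4');
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_WRITEFUNCTION,
    makeLimitedWriter($fh, 1024 * 1024, $written)); // ~1 MiB budget
curl_exec($ch); // returns false when we abort early; that is expected here
curl_close($ch);
fclose($fh);
```

The last chunk may overshoot the budget slightly, since cURL hands the callback whole chunks; for a thumbnail that imprecision does not matter.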

For some container types, the file metadata is at the end of the file (MP4's moov atom, for example, is often written last). For those, I do not have a suggestion for you. Short of writing your own demuxer and splicing the trailing metadata onto data from the beginning, I do not know of a way to do it without downloading the whole file.

Brad answered Oct 22 '22