
Getting a video from S3 and Uploading to YouTube in PHP

I have some code working that uploads a video file up to YouTube:

$yt = new Zend_Gdata_YouTube($httpClient);

// create a new VideoEntry object
$myVideoEntry = new Zend_Gdata_YouTube_VideoEntry();

// create a new Zend_Gdata_App_MediaFileSource object
$filesource = $yt->newMediaFileSource('file.mov');
$filesource->setContentType('video/quicktime');
// set slug header
$filesource->setSlug('file.mov');
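
The rest of my upload code is essentially the standard Zend pattern: attach the media source, set the metadata, and insert the entry. Roughly (the title, description, and category here are just placeholders):

// attach the media source and basic metadata to the entry
$myVideoEntry->setMediaSource($filesource);
$myVideoEntry->setVideoTitle('My video');
$myVideoEntry->setVideoDescription('My video description');
$myVideoEntry->setVideoCategory('Autos');

// send the entry to the YouTube uploads feed
$uploadUrl = 'http://uploads.gdata.youtube.com/feeds/api/users/default/uploads';
$newEntry = $yt->insertEntry($myVideoEntry, $uploadUrl, 'Zend_Gdata_YouTube_VideoEntry');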

I have videos in S3 that I want to upload to YouTube. The videos in our S3 account are public, so I can fetch them with a command like wget. Should I run a command that wgets the video file and downloads it locally before I run this script (shell_exec("wget ".$s3videoURL))?

Or should I try using the URL of the S3 file itself as the MediaFileSource?
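
For clarity, the first option would be something like this (untested; the temporary path is just an example):

$localPath = '/tmp/file.mov'; // example temporary location
shell_exec('wget -q -O ' . escapeshellarg($localPath) . ' ' . escapeshellarg($s3videoURL));

// then feed the local copy to the existing upload code
$filesource = $yt->newMediaFileSource($localPath);
$filesource->setContentType('video/quicktime');
$filesource->setSlug(basename($localPath));
// ... upload as above, then unlink($localPath);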

Mainly, I just need stability (not a solution prone to frequent time-outs); speed and local storage aren't really important (I can delete the local video file once it's been uploaded).

What would be the best way to go about this?

Thanks!

Update: I should probably mention that this script is going to be uploading about 5 videos to YouTube per execution.

asked Apr 10 '12 by SSH This


1 Answer

This is an old question, but I believe I have a better answer.

You don't have to write the video to disk, and you can't keep the whole thing in RAM (I assume it is a big file).

You can use the AWS SDK for PHP and the Google API client libraries to buffer the file from S3 and send it to YouTube on the fly. Use the registerStreamWrapper method to register S3 as a file system, and use the YouTube API's resumable uploads. Then all you have to do is read chunks from S3 with fread and send them to YouTube. This way you can even limit RAM usage.

I assume you have already created the video object ($video in the code below) from the Google_Video class.
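
If not, it might be built roughly like this (a sketch based on the old Google API client's generated classes; the title, description, category, and privacy status are placeholders):

// minimal video metadata for the insert call
$snippet = new Google_VideoSnippet();
$snippet->setTitle('My S3 video');            // placeholder title
$snippet->setDescription('Uploaded from S3'); // placeholder description
$snippet->setCategoryId('22');                // e.g. "People & Blogs"

$status = new Google_VideoStatus();
$status->setPrivacyStatus('private');         // or 'public' / 'unlisted'

$video = new Google_Video();
$video->setSnippet($snippet);
$video->setStatus($status);

With $video ready, here is the complete code: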

<?php
require_once 'path/to/libraries/aws/vendor/autoload.php';
require_once 'path/to/libraries/google-client-lib/autoload.php';

use Aws\S3\S3Client;

$chunkSizeBytes = 2 * 1024 * 1024; // 2 MB
$streamName = 's3://bucketname/video.mp4';

$s3client = S3Client::factory(array(
    'key'    => S3_ACCESS_KEY,
    'secret' => S3_SECRET_KEY,
    'region' => 'eu-west-1' // if you need to set it
));
$s3client->registerStreamWrapper(); // makes s3:// paths usable like local files

$client = new Google_Client();
$client->setClientId(YOUTUBE_CLIENT_ID);
$client->setClientSecret(YOUTUBE_CLIENT_SECRET);
$client->setAccessToken(YOUTUBE_TOKEN);

$youtube = new Google_YoutubeService($client);
$media = new Google_MediaFileUpload('video/*', null, true, $chunkSizeBytes);

$filesize = filesize($streamName); // works like a regular file thanks to the stream wrapper
$media->setFileSize($filesize);

$insertResponse = $youtube->videos->insert("status,snippet", $video, array('mediaUpload' => $media));
$uploadStatus = false;

$handle = fopen($streamName, "r");
$totalSent = 0;
$chunkBuffer = '';
while (!$uploadStatus && !feof($handle)) {
    // fread on the S3 stream may return fewer bytes than requested,
    // so buffer until a full chunk is available to send.
    $chunk = fread($handle, $chunkSizeBytes);
    $chunkBuffer .= $chunk;

    if (strlen($chunkBuffer) > $chunkSizeBytes) {
        $fullChunk = substr($chunkBuffer, 0, $chunkSizeBytes);
        $chunkBuffer = substr($chunkBuffer, $chunkSizeBytes); // keep the leftover for the next round

        $uploadStatus = $media->nextChunk($insertResponse, $fullChunk);
        $totalSent += strlen($fullChunk);

        echo PHP_EOL.'Status: '.$totalSent.' / '.$filesize.' (%'.(($totalSent / $filesize) * 100).')'.PHP_EOL;
    }
}

// Send whatever is left in the buffer as the final (possibly smaller) chunk.
if (strlen($chunkBuffer) > 0) {
    $uploadStatus = $media->nextChunk($insertResponse, $chunkBuffer);
    $totalSent += strlen($chunkBuffer);
}
fclose($handle);
answered Nov 13 '22 by previous_developer