Can I use AVFoundation to stream downloaded video frames into an OpenGL ES texture?

I've been able to use AVFoundation's AVAssetReader class to upload video frames into an OpenGL ES texture. It has a caveat, however, in that it fails when used with an AVURLAsset that points to remote media. This failure isn't well documented, and I'm wondering if there's any way around the shortcoming.

Matt Wilding asked Sep 19 '12

1 Answer

There's some API that was released with iOS 6 that I've been able to use to make the process a breeze. It doesn't use AVAssetReader at all, and instead relies on a class called AVPlayerItemVideoOutput. An instance of this class can be added to any AVPlayerItem instance via a new -addOutput: method.

Unlike AVAssetReader, this class works fine for AVPlayerItems that are backed by a remote AVURLAsset. It also has the benefit of allowing for a more sophisticated playback interface that supports non-linear playback via -copyPixelBufferForItemTime:itemTimeForDisplay: (instead of AVAssetReader's severely limiting -copyNextSampleBuffer method).


SAMPLE CODE

// Initialize the AVFoundation state
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{

    NSError* error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
    if (status == AVKeyValueStatusLoaded)
    {
        NSDictionary* settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] };
        AVPlayerItemVideoOutput* output = [[[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings] autorelease];
        AVPlayerItem* playerItem = [AVPlayerItem playerItemWithAsset:asset];
        [playerItem addOutput:output];
        AVPlayer* player = [AVPlayer playerWithPlayerItem:playerItem];

        // Assume some instance variables exist here. You'll need them to control
        // playback of the video (via the AVPlayer), and to copy sample buffers
        // (via the AVPlayerItemVideoOutput).
        [self setPlayer:player];
        [self setPlayerItem:playerItem];
        [self setOutput:output];
    }
    else
    {
        NSLog(@"%@ Failed to load the tracks.", self);
    }
}];

// Now at any later point in time, you can get a pixel buffer
// that corresponds to the current AVPlayer state like this.
// (Note the Copy in the method name: you own the returned buffer
// and must balance it with CVPixelBufferRelease() when done.)
CVPixelBufferRef buffer = [[self output] copyPixelBufferForItemTime:[[self playerItem] currentTime] itemTimeForDisplay:nil];
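In practice you usually want to copy frames in step with the display rather than on demand. Here's a minimal sketch of that pattern; the display-link setup, the render: selector name, and the accessors are assumptions carried over from the snippet above, not part of the original answer:

```objc
// Somewhere during setup, drive rendering from the display refresh:
//   CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
//                                                     selector:@selector(render:)];
//   [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];

- (void)render:(CADisplayLink *)sender
{
    // Map the host clock onto the item's timeline, and only copy
    // when a new frame is actually available.
    CMTime itemTime = [[self output] itemTimeForHostTime:CACurrentMediaTime()];
    if ([[self output] hasNewPixelBufferForItemTime:itemTime])
    {
        CVPixelBufferRef buffer = [[self output] copyPixelBufferForItemTime:itemTime
                                                         itemTimeForDisplay:NULL];
        // ... upload the buffer to OpenGL here ...
        CVPixelBufferRelease(buffer); // balance the copy
    }
}
```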

Once you've got your buffer, you can upload it to OpenGL however you want. I recommend the horribly documented CVOpenGLESTextureCacheCreateTextureFromImage() function, because you'll get hardware acceleration on all the newer devices, which is much faster than glTexSubImage2D(). See Apple's GLCameraRipple and RosyWriter demos for examples.
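For reference, here's a sketch of that texture-cache path. It assumes an EAGLContext is current and that _textureCache was created once up front with CVOpenGLESTextureCacheCreate(); the variable names are illustrative:

```objc
// One-time setup (e.g. right after creating your EAGLContext):
//   CVOpenGLESTextureCacheRef _textureCache;
//   CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, context,
//                                NULL, &_textureCache);

// Per-frame: wrap the pixel buffer in an OpenGL ES texture.
CVOpenGLESTextureRef texture = NULL;
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(
    kCFAllocatorDefault,
    _textureCache,
    buffer,                       // the BGRA CVPixelBufferRef copied above
    NULL,
    GL_TEXTURE_2D,
    GL_RGBA,
    (GLsizei)CVPixelBufferGetWidth(buffer),
    (GLsizei)CVPixelBufferGetHeight(buffer),
    GL_BGRA,                      // matches kCVPixelFormatType_32BGRA
    GL_UNSIGNED_BYTE,
    0,
    &texture);

if (err == kCVReturnSuccess)
{
    glBindTexture(CVOpenGLESTextureGetTarget(texture),
                  CVOpenGLESTextureGetName(texture));
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // ... draw with the texture, then release it and flush the cache:
    CFRelease(texture);
    CVOpenGLESTextureCacheFlush(_textureCache, 0);
}
```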

Matt Wilding answered Nov 17 '22