iOS alternative to QTMovieLayer that has non-nil `contents`?

Background

QTKit (QuickTime Kit) is a Mac framework from the 10.3 days that got some layer additions in 10.5, for example QTMovieLayer. One of the nice things about QTMovieLayer is that you can access the movie content using the layer's regular `contents` property and get a CAImageQueue object back. You can then create a bunch of regular CALayers, set the image queue as their contents, and give each layer its own part of the movie by setting the appropriate `contentsRect`. This means that you can create something like the image below with only one movie running, and you get both synchronization and memory efficiency.

[image: a single movie whose frames are split across multiple layers, each layer showing its own part of the video]

I could post that code, but I doubt it's relevant to my question, since the question is about iOS and QTKit doesn't exist there.
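
For context, the Mac-side setup is roughly this (a sketch of the QTKit/Core Animation calls, not the exact code behind the image; `movieURL` and the rectangle are placeholders):

QTMovie *movie = [QTMovie movieWithURL:movieURL error:NULL];
QTMovieLayer *movieLayer = [QTMovieLayer layerWithMovie:movie];
id sharedContents = movieLayer.contents; // a CAImageQueue on 10.5+

CALayer *tile = [CALayer layer];
tile.contents = sharedContents;                   // same movie content as the movie layer
tile.contentsRect = CGRectMake(0, 0.5, 0.5, 0.5); // this layer only shows one quarter of the frame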

My problem

I want to do the same thing on iOS, but QTKit doesn't exist there. That is, I'm looking for an alternative to QTMovieLayer that exists on iOS where the `contents` property is not nil (this is the important part) so that I can share the contents between multiple layers.

Also, once I manage to get a video to play in multiple layers, I want to be able to specify what part of the video should play in what layer (as in the image above).


What I've tried

On iOS you have either AVFoundation or the MediaPlayer framework for playing back movies. In AVFoundation you have AVPlayerLayer, which can display movies. (You also have AVCaptureVideoPreviewLayer for displaying a preview of the camera input, and AVSynchronizedLayer for synchronizing, but not displaying, videos.)

Note: in all these examples, I successfully get the video to play in the layer, with sound and everything. That is not my problem. The problem is that I can't get the contents so that I can share them with other layers.

I have read Technical Note TN2300 - Transitioning QTKit code to AV Foundation but couldn't find anything about sharing video content between layers.

AVPlayerLayer

If I create a player layer like this and try to get the contents, I only get nil back:

NSURL *videoURL = [[NSBundle mainBundle] URLForResource:@"video" withExtension:@"m4v"];
AVPlayer *player = [AVPlayer playerWithURL:videoURL];
[player play];
AVPlayerLayer *videoLayer = [AVPlayerLayer playerLayerWithPlayer:player];
id videoContent = videoLayer.contents; // <-- this is nil :(

AVCaptureVideoPreviewLayer

Even though I'm really interested in playing back a movie file, I tried using a capture preview layer, but the contents are nil just as before:

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];

AVCaptureSession *captureSession = [AVCaptureSession new];
[captureSession addInput:input];
AVCaptureVideoPreviewLayer *videoLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];

[captureSession startRunning];

id videoContent = videoLayer.contents; // <-- this is still nil :(

AVSynchronizedLayer

From the name of it, you may think that the synchronized layer would be what I'm looking for, but it is not used to display anything (from the header documentation):

An AVSynchronizedLayer is similar to a CATransformLayer in that it doesn't display anything itself but only confers state upon its layer subtree.

MediaPlayer framework

The MediaPlayer framework doesn't have a layer variant, and creating a player view controller just to get at the layer of the player's view didn't seem like a valid option to me. (Yes, I didn't even bother to try it.)


Recap of the question:

Is there any alternative to QTMovieLayer that exists on iOS where you can get the content of the currently playing video and display parts of it in multiple other layers at the same time?

Asked Sep 28 '13 by David Rönnqvist


1 Answer

To get video/audio data buffers from a capture session or a player session:

  • Create an AVCaptureVideoDataOutput / AVCaptureAudioDataOutput object.
  • Conform one of your classes to AVCaptureVideoDataOutputSampleBufferDelegate.
  • Add the AVCaptureVideoDataOutput to your capture/player session (see the setup sketch after this list).
  • Implement the protocol methods. You will receive a CMSampleBufferRef object containing video/audio frames, as the media is being captured/played, in the captureOutput... method of AVCaptureVideoDataOutputSampleBufferDelegate.
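
A minimal setup sketch (assuming ARC and the captureSession from the question; the queue label is arbitrary):

AVCaptureVideoDataOutput *videoOutput = [AVCaptureVideoDataOutput new];
dispatch_queue_t sampleQueue = dispatch_queue_create("video.samples", DISPATCH_QUEUE_SERIAL);
[videoOutput setSampleBufferDelegate:self queue:sampleQueue]; // self conforms to AVCaptureVideoDataOutputSampleBufferDelegate
if ([captureSession canAddOutput:videoOutput]) {
    [captureSession addOutput:videoOutput];
}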

A CMSampleBufferRef object contains the media frame data, timestamp information, and the format description of the media. You can then display a frame by converting it into a CGImageRef and setting it on any view or layer.
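
Here is a sketch of that delegate callback; the method signature is the real AVCaptureVideoDataOutputSampleBufferDelegate API, but `self.ciContext` (a CIContext) and `self.tileLayers` are assumed properties for illustration:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Pull the pixel data out of the sample buffer and turn it into a CGImage.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CGImageRef cgImage = [self.ciContext createCGImage:frame fromRect:frame.extent];

    dispatch_async(dispatch_get_main_queue(), ^{
        // The same CGImage can be shared between several layers; each layer
        // can show its own part of the frame via contentsRect (unit coordinates).
        for (CALayer *tile in self.tileLayers) {
            tile.contents = (__bridge id)cgImage;
        }
        CGImageRelease(cgImage); // the layers retain the image via contents
    });
}

Converting every frame through Core Image has a cost, but once all layers share one CGImage, the contentsRect trick from the question should work the same way it does with QTMovieLayer.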

You can also specify the desired frame compression format (or an uncompressed pixel format) in the AVCaptureVideoDataOutput.videoSettings property.
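
For example, requesting uncompressed 32-bit BGRA frames (a common choice that is convenient for CGImage conversion):

videoOutput.videoSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
};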

Answered Nov 12 '22 by tejusadiga2004