I've managed to load the video track of a movie frame by frame into an OpenGL texture with AVFoundation. I followed the steps described in the answer here: iOS4: how do I use video file as an OpenGL texture? and took some code from the GLVideoFrame sample from WWDC2010, which can be downloaded here.
How do I play the audio track of the movie synchronously with the video? I think it would be better not to play it in a separate player, but to use the audio track of the same AVAsset:
AVAssetTrack* audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
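
I could then add a second output to the same AVAssetReader that decodes this track to linear PCM. A sketch of what I have in mind (reader is assumed to be the AVAssetReader already set up for the video track):

// Sketch: add an audio output to the same AVAssetReader, decompressing
// the track to linear PCM so the raw samples can be handed to an audio API.
NSDictionary *audioSettings = [NSDictionary
    dictionaryWithObject:[NSNumber numberWithUnsignedInt:kAudioFormatLinearPCM]
                  forKey:AVFormatIDKey];
AVAssetReaderTrackOutput *audioOutput =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack
                                               outputSettings:audioSettings];
if ([reader canAddOutput:audioOutput]) {
    [reader addOutput:audioOutput];
}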
I retrieve a video frame and its timestamp in the CADisplayLink callback via
CMSampleBufferRef sampleBuffer = [self.readerOutput copyNextSampleBuffer];
CMTime timestamp = CMSampleBufferGetPresentationTimeStamp( sampleBuffer );
where readerOutput is of type AVAssetReaderTrackOutput*.
How do I get the corresponding audio samples, and how do I play them?
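
For the retrieval side, I imagine pulling buffers from the audio output and comparing presentation timestamps against the video timestamp, along these lines (a sketch; audioOutput is the hypothetical audio AVAssetReaderTrackOutput from above, and I still don't know what to do with the buffers afterwards):

// Sketch: pull the next audio sample buffer and compare its presentation
// time against the current video timestamp before handing it to playback.
CMSampleBufferRef audioBuffer = [audioOutput copyNextSampleBuffer];
if (audioBuffer) {
    CMTime audioTimestamp = CMSampleBufferGetPresentationTimeStamp(audioBuffer);
    if (CMTimeCompare(audioTimestamp, timestamp) <= 0) {
        // ... hand the buffer to whatever does the actual audio playback
    }
    CFRelease(audioBuffer);
}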
Edit:
I've looked around a bit, and I think the best option would be to use an AudioQueue from the AudioToolbox.framework, using the approach described here: AVAssetReader and Audio Queue streaming problem.
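
If I understand that approach correctly, each CMSampleBufferRef from the reader gets copied into an AudioQueue buffer and enqueued. A minimal sketch of the enqueue step, assuming queue was already created with AudioQueueNewOutput from the track's stream description and started, and that the reader output delivers linear PCM:

#import <AudioToolbox/AudioToolbox.h>
#import <CoreMedia/CoreMedia.h>

// Sketch: copy one sample buffer's PCM data into an AudioQueue buffer
// and enqueue it for playback.
static void EnqueueSampleBuffer(AudioQueueRef queue, CMSampleBufferRef sampleBuffer)
{
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    if (blockBuffer == NULL)
        return;

    size_t length = CMBlockBufferGetDataLength(blockBuffer);
    AudioQueueBufferRef queueBuffer;
    if (AudioQueueAllocateBuffer(queue, (UInt32)length, &queueBuffer) != noErr)
        return;

    // Copy the raw bytes out of the block buffer into the queue buffer.
    CMBlockBufferCopyDataBytes(blockBuffer, 0, length, queueBuffer->mAudioData);
    queueBuffer->mAudioDataByteSize = (UInt32)length;

    // No packet descriptions are needed for constant-bitrate linear PCM.
    AudioQueueEnqueueBuffer(queue, queueBuffer, 0, NULL);
}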
There is also an audio player in AVFoundation: AVAudioPlayer. But I don't know exactly how I should pass data to its initWithData: initializer, which expects NSData. Furthermore, I don't think it's the best choice for my case, because as I understand it a new AVAudioPlayer instance would have to be created for every new chunk of audio samples.
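
As far as I can tell, initWithData:error: expects the complete bytes of an audio file in a supported container format, not raw PCM chunks, i.e. something like this (audioFileURL is just a placeholder):

// This plays a whole audio file loaded into memory -- not a stream of
// decoded sample buffers, which is what AVAssetReader produces.
NSData *fileData = [NSData dataWithContentsOfURL:audioFileURL];
NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithData:fileData error:&error];
[player play];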
Any other suggestions?
What's the best way to play the raw audio samples that I get from the AVAssetReaderTrackOutput?
You want to do an AV composition. You can merge multiple media sources, synchronized temporally, into one output.
http://developer.apple.com/library/ios/#DOCUMENTATION/AVFoundation/Reference/AVComposition_Class/Reference/Reference.html
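
A rough sketch of what that could look like (illustrative only; the composition is itself an AVAsset, so it can be handed to an AVPlayer or to a single AVAssetReader as one synchronized source):

// Sketch: merge the video and audio tracks of asset into one
// AVMutableComposition, both starting at time zero.
AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange fullRange = CMTimeRangeMake(kCMTimeZero, asset.duration);

AVMutableCompositionTrack *compVideoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
[compVideoTrack insertTimeRange:fullRange
                        ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:kCMTimeZero
                          error:NULL];

AVMutableCompositionTrack *compAudioTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];
[compAudioTrack insertTimeRange:fullRange
                        ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                         atTime:kCMTimeZero
                          error:NULL];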