I noticed in the iOS documentation for AVAssetWriterInput that you can pass nil for the outputSettings dictionary to specify that the input data should not be re-encoded:

"The settings used for encoding the media appended to the output. Pass nil to specify that appended samples should not be re-encoded."
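For reference, this is roughly how such a pass-through input would be created (a minimal sketch; assetWriter is assumed to be an AVAssetWriter configured elsewhere):

AVAssetWriterInput *videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                    outputSettings:nil]; // nil = no re-encoding
videoInput.expectsMediaDataInRealTime = YES;
[assetWriter addInput:videoInput];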
I want to take advantage of this feature to pass in a stream of raw H.264 NALs, but I am having trouble adapting my raw byte streams into a CMSampleBuffer that I can pass into AVAssetWriterInput's appendSampleBuffer: method. My stream of NALs contains only SPS (type 7), PPS (type 8), IDR (type 5), and P (type 1) NALs. I haven't been able to find documentation or a conclusive answer on how to use pre-encoded H.264 data with AVAssetWriter, and the resulting video file won't play.

How can I properly package the NAL units into CMSampleBuffers? Do I need to use a start code prefix? A length prefix? Do I need to ensure I put only one NAL per CMSampleBuffer? My end goal is to create an MP4 or MOV container with H.264/AAC.
Here's the code I've been playing with:
-(void)addH264NAL:(NSData *)nal
{
    dispatch_async(recordingQueue, ^{
        // Adapt the raw NAL into a CMSampleBuffer
        CMSampleBufferRef sampleBuffer = NULL;
        CMBlockBufferRef blockBuffer = NULL;
        CMFormatDescriptionRef formatDescription = NULL;
        CMItemCount numberOfSampleTimeEntries = 1;
        CMItemCount numberOfSamples = 1;
        CMVideoFormatDescriptionCreate(kCFAllocatorDefault, kCMVideoCodecType_H264, 480, 360, NULL, &formatDescription);

        // Allocate a block buffer big enough for the NAL, then copy its bytes in
        OSStatus result = CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, NULL, [nal length], kCFAllocatorDefault, NULL, 0, [nal length], kCMBlockBufferAssureMemoryNowFlag, &blockBuffer);
        if(result != noErr)
        {
            NSLog(@"Error creating CMBlockBuffer");
            CFRelease(formatDescription);
            return;
        }
        result = CMBlockBufferReplaceDataBytes([nal bytes], blockBuffer, 0, [nal length]);
        if(result != noErr)
        {
            NSLog(@"Error filling CMBlockBuffer");
            CFRelease(blockBuffer);
            CFRelease(formatDescription);
            return;
        }

        // Wrap the block buffer in a single-sample CMSampleBuffer
        const size_t sampleSizes = [nal length];
        CMSampleTimingInfo timing = { 0 };
        result = CMSampleBufferCreate(kCFAllocatorDefault, blockBuffer, YES, NULL, NULL, formatDescription, numberOfSamples, numberOfSampleTimeEntries, &timing, 1, &sampleSizes, &sampleBuffer);
        if(result != noErr)
        {
            NSLog(@"Error creating CMSampleBuffer");
        }
        else
        {
            [self writeSampleBuffer:sampleBuffer ofType:AVMediaTypeVideo];
            CFRelease(sampleBuffer);
        }
        CFRelease(blockBuffer);
        CFRelease(formatDescription);
    });
}
Note that I'm calling CMSampleBufferSetOutputPresentationTimeStamp on the sample buffer inside of the writeSampleBuffer method, with what I think is a valid time, before actually trying to append it. Any help is appreciated.
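Roughly, the relevant part of writeSampleBuffer looks like this (a sketch; presentationTime and writerInput stand in for my actual variables):

CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, presentationTime);
if(![writerInput appendSampleBuffer:sampleBuffer])
{
    NSLog(@"Failed to append sample buffer: %@", assetWriter.error);
}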
I managed to get video playback working in VLC but not QuickTime. I used code similar to what I posted above to get H.264 NALs into CMSampleBuffers.
I had two main issues:

1. The sample timing information wasn't being set correctly.
2. The raw NAL stream wasn't in a form AVAssetWriter would accept.
To solve #1, I set timing.duration = CMTimeMake(1, fps);, where fps is the expected frame rate. I then set timing.decodeTimeStamp = kCMTimeInvalid; to indicate that the samples will be given in decoding order. Lastly, I set timing.presentationTimeStamp by calculating the absolute time, which I also used with startSessionAtSourceTime:.
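In code, the timing setup looks like this (a sketch; fps and framePTS are assumed to come from the capture source):

CMSampleTimingInfo timing;
timing.duration = CMTimeMake(1, fps);      // one frame at the expected frame rate
timing.decodeTimeStamp = kCMTimeInvalid;   // samples are appended in decode order
timing.presentationTimeStamp = framePTS;   // absolute time, on the same timeline
                                           // passed to startSessionAtSourceTime: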
To solve #2, through trial and error I found that giving my NAL units in the following form worked:
[7 8 5] [1] [1] [1]..... [7 8 5] [1] [1] [1]..... (repeating)
where each NAL unit is prefixed by the 32-bit start code 0x00000001.
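Prepending the start code is straightforward (a sketch; nalPayload is assumed to be the raw NAL bytes without any prefix):

static const uint8_t startCode[4] = {0x00, 0x00, 0x00, 0x01};
NSMutableData *nal = [NSMutableData dataWithBytes:startCode length:4];
[nal appendData:nalPayload];
[self addH264NAL:nal];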
Presumably for the same reason it's not playing in QuickTime, I'm still having trouble moving the resulting .mov file to the photo album: the ALAssetsLibrary method videoAtPathIsCompatibleWithSavedPhotosAlbum: is failing, stating that the "Movie could not be played." Hopefully someone with an idea about what's going on can comment. Thanks!