I'm recording small video clips (around one second each, with both the front and the rear camera, and possibly with different orientations). I then try to merge them using AVAssetExportSession. I basically build a composition and a videoComposition with the proper transforms and the audio & video tracks.
The problem is that on iOS 5 the export fails if there are more than 4 video clips, and on iOS 6 the limit seems to be 16 clips.
This seems really puzzling to me. Is AVAssetExportSession doing something weird, or does it have some undocumented limitation on the number of clips that can be passed to it? Here are some excerpts from my code:
-(void)exportVideo
{
    AVMutableComposition *composition = video.composition;
    AVMutableVideoComposition *videoComposition = video.videoComposition;
    NSString *presetName = AVAssetExportPresetMediumQuality;

    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:composition presetName:presetName];
    self.exportSession = _assetExport;

    videoComposition.renderSize = CGSizeMake(640, 480);
    _assetExport.videoComposition = videoComposition;

    NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"export.mov"];
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];

    // Delete the previously exported file if it exists
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath])
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];

    _assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    _assetExport.outputURL = exportUrl;
    _assetExport.shouldOptimizeForNetworkUse = YES;

    [_assetExport exportAsynchronouslyWithCompletionHandler:^{
        switch (_assetExport.status)
        {
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Completed exporting!");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Failed: %@", _assetExport.error.description);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Canceled: %@", _assetExport.error);
                break;
            default:
                break;
        }
    }];
}
And here's how the compositions are made:
-(void)setVideoAndExport
{
    video = nil;
    video = [[VideoComposition alloc] initVideoTracks];

    CMTime localTimeline = kCMTimeZero;

    // Create the composition from all video files
    for (NSURL *url in outputFileUrlArray) {
        AVAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
        [video setVideo:url at:localTimeline];
        localTimeline = CMTimeAdd(localTimeline, asset.duration); // Increment the timeline
    }
    [self exportVideo];
}
And here's the meat of the VideoComposition class:
-(id)initVideoTracks
{
    if ((self = [super init]))
    {
        composition = [[AVMutableComposition alloc] init];
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        instructions = [[NSMutableArray alloc] init];
        videoComposition = [AVMutableVideoComposition videoComposition];
    }
    return self;
}
-(void)setVideo:(NSURL *)url at:(CMTime)to
{
    asset = [[AVURLAsset alloc] initWithURL:url options:nil];

    AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    AVMutableCompositionTrack *compositionTrackVideo = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionTrackVideo insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetTrack atTime:to error:nil];

    AVMutableCompositionTrack *compositionTrackAudio = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionTrackAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:to error:nil];

    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeAdd(to, asset.duration));

    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionTrackVideo];
    [layerInstruction setTransform:assetTrack.preferredTransform atTime:kCMTimeZero];
    [layerInstruction setOpacity:0.0 atTime:CMTimeAdd(to, asset.duration)];

    [instructions addObject:layerInstruction];
    mainInstruction.layerInstructions = instructions;

    videoComposition.instructions = [NSArray arrayWithObject:mainInstruction];
    videoComposition.frameDuration = CMTimeMake(1, 30);
}
Okay, I also contacted Apple about this issue and they gave a response:
"This is a known condition. You are hitting the decoder limit set in AVFoundation."
They also asked me to file a bug report on the issue, since the error message that AVAssetExportSession gives is vague and misleading. So I filed a bug report with Apple about the poor error message.
So these limits in AVAssetExportSession are confirmed: in iOS 5 the decoder limit is 4, and in iOS 6 it was raised to 16. The main issue here is that the error reported by AVAssetExportSession is poor, as it only reports error 11820 "Cannot Complete Export" instead of actually telling us that we have hit a limit.
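Until the error reporting improves, one cheap safeguard (my own addition, not something Apple suggested) is to count the video tracks in the composition before exporting and warn when the count exceeds the decoder limit of the running iOS version:

    // Hypothetical pre-flight check; assumes one composition track per clip,
    // as in the setVideo: implementation above.
    NSUInteger decoderLimit = ([[[UIDevice currentDevice] systemVersion] floatValue] >= 6.0) ? 16 : 4;
    NSUInteger videoTrackCount = [[composition tracksWithMediaType:AVMediaTypeVideo] count];
    if (videoTrackCount > decoderLimit) {
        NSLog(@"Composition has %lu video tracks, which exceeds the decoder limit of %lu",
              (unsigned long)videoTrackCount, (unsigned long)decoderLimit);
    }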
I have also encountered a similar issue. I managed to fix it by inserting whole assets into the composition rather than inserting tracks into mutable tracks. So, in your setVideo: method, instead of this line:
[compositionTrackVideo insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetTrack atTime:to error:nil];
try this (insertTimeRange:ofAsset:atTime:error: is a method on AVMutableComposition, so it is sent to the composition itself):
[composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofAsset:asset atTime:to error:nil];
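To make the change concrete, here is a rough sketch of how setVideo:at: might look with this call (my illustration, not the exact code from the answer above). Because the composition creates and reuses its own tracks for the inserted asset, you no longer get a per-clip composition track back, so the per-clip layer instructions and preferredTransform handling from the original code would need to be reworked separately:

    -(void)setVideo:(NSURL *)url at:(CMTime)to
    {
        NSError *error = nil;
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];

        // Insert the whole asset (audio and video) instead of creating a new
        // mutable track per clip; letting the composition reuse its own tracks
        // keeps the track count, and therefore the number of decoders, low.
        BOOL inserted = [composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                             ofAsset:asset
                                              atTime:to
                                               error:&error];
        if (!inserted) {
            NSLog(@"Failed to insert %@: %@", url, error);
        }
    }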