EDIT: The strangest thing: when I run this code from a full app, everything works. I had always been running the movie creation from my unit tests, and only there does it fail. Still trying to figure out why that is...
I'm trying to combine video + audio + text using AVMutableComposition and export it to a new video.
My code is based on the AVEditDemo sample from WWDC '10.
I added a purple background to the CATextLayer so I could verify that the layer itself is exported to the movie, but no text is shown... I've tried playing with various fonts, positions, and color definitions, but nothing helped, so I'm posting the code here in case anyone has stumbled across something similar and can tell me what I'm missing.
Here's the code (self.audio and self.video are AVURLAssets):
CMTime exportDuration = self.audio.duration;
AVMutableComposition *composition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *videoTrack = [[self.video tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
// add the video in loop until the audio ends
CMTime currStartTime = kCMTimeZero;
while (CMTimeCompare(currStartTime, exportDuration) < 0) {
    CMTime timeRemaining = CMTimeSubtract(exportDuration, currStartTime);
    CMTime currLoopDuration = self.video.duration;
    if (CMTimeCompare(currLoopDuration, timeRemaining) > 0) {
        currLoopDuration = timeRemaining;
    }
    CMTimeRange currLoopTimeRange = CMTimeRangeMake(kCMTimeZero, currLoopDuration);
    [compositionVideoTrack insertTimeRange:currLoopTimeRange ofTrack:videoTrack
                                    atTime:currStartTime error:nil];
    currStartTime = CMTimeAdd(currStartTime, currLoopDuration);
}
// add the full audio track, starting at time zero
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *audioTrack = [self.audio.tracks objectAtIndex:0];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, self.audio.duration) ofTrack:audioTrack atTime:kCMTimeZero error:nil];
AVMutableVideoComposition *videoComposition;
// the text layer part - THIS IS THE PART THAT DOESN'T WORK WELL
CALayer *animatedTitleLayer = [CALayer layer];
CATextLayer *titleLayer = [[CATextLayer alloc] init];
titleLayer.string = @"asdfasdf";
titleLayer.alignmentMode = kCAAlignmentCenter;
titleLayer.bounds = CGRectMake(0, 0, self.video.naturalSize.width / 2, self.video.naturalSize.height / 2);
titleLayer.opacity = 1.0;
titleLayer.backgroundColor = [UIColor purpleColor].CGColor;
[animatedTitleLayer addSublayer:titleLayer];
animatedTitleLayer.position = CGPointMake(self.video.naturalSize.width / 2.0, self.video.naturalSize.height / 2.0);
// build a Core Animation tree that contains both the animated title and the video.
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, self.video.naturalSize.width, self.video.naturalSize.height);
videoLayer.frame = CGRectMake(0, 0, self.video.naturalSize.width, self.video.naturalSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:animatedTitleLayer];
videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
// a single pass-through instruction covering the full export duration
AVMutableVideoCompositionInstruction *passThroughInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
passThroughInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, exportDuration);
AVMutableVideoCompositionLayerInstruction *passThroughLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
passThroughInstruction.layerInstructions = [NSArray arrayWithObject:passThroughLayer];
videoComposition.instructions = [NSArray arrayWithObject:passThroughInstruction];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = self.video.naturalSize;
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
exportSession.videoComposition = videoComposition;
exportSession.outputURL = [NSURL fileURLWithPath:self.outputFilePath];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    // save the video ...
}];
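For reference, inside the handler the export status can be checked before saving, e.g. (a minimal sketch using the same exportSession and self.outputFilePath as above):

switch (exportSession.status) {
    case AVAssetExportSessionStatusCompleted:
        // move/save the file at self.outputFilePath
        break;
    case AVAssetExportSessionStatusFailed:
        NSLog(@"Export failed: %@", exportSession.error);
        break;
    case AVAssetExportSessionStatusCancelled:
        NSLog(@"Export was cancelled");
        break;
    default:
        break;
}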
I ran into the same issue in a different context. In my case, I had moved the preparation of the AVMutableComposition to a background thread, and moving that preparation back to the main queue/thread made the CATextLayer overlays render properly again.
This may not apply exactly to your unit-testing context, but my guess is that CATextLayer/AVFoundation depends on some part of UIKit/AppKit being up and available (a drawing context? a current screen?) on the thread doing the work, which might explain the failure we are both seeing.
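In case it helps, the change on my side amounted to something like this (a rough sketch; buildCompositionAndExport is just a stand-in for whatever method contains the composition/CATextLayer setup and the export call):

// Make sure the AVMutableComposition / CATextLayer setup runs on the main queue
// before kicking off the export.
dispatch_async(dispatch_get_main_queue(), ^{
    [self buildCompositionAndExport]; // stand-in for the setup + export code above
});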
I had a problem where almost everything rendered fine, including images supplied as a CALayer's contents via a CGImage. The exception was the CATextLayer's text: a background color, beginTime, and duration set on the CATextLayer rendered perfectly, but the actual text refused to appear. That was all in the simulator; then I ran it on the phone, and it was perfect.
Conclusion: the simulator renders videos nicely... until you use CATextLayer.
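If you want a build-time reminder of that, something like this works (a small sketch using the standard TARGET_IPHONE_SIMULATOR macro from TargetConditionals.h; the helper name is just illustrative):

#import <Foundation/Foundation.h>
#import <TargetConditionals.h>

// Illustrative helper: log a warning when the export is running in the simulator,
// where CATextLayer text may not render into the exported video.
static void WarnIfRunningInSimulator(void) {
#if TARGET_IPHONE_SIMULATOR
    NSLog(@"Running in the simulator - verify CATextLayer rendering on a real device.");
#endif
}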