
How to combine video clips with different orientation using AVFoundation

I am trying to combine several video clips into one using AVFoundation. I can create a single video using AVMutableComposition with the code below:

AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

CMTime startTime = kCMTimeZero;
BOOL ok = NO;

/* videoClipPaths is an array of paths of the recorded video clips */
// Loop to combine the clips into a single video
for (NSInteger i = 0; i < [videoClipPaths count]; i++) {
    NSString *path = (NSString *)[videoClipPaths objectAtIndex:i];
    NSURL *url = [[NSURL alloc] initFileURLWithPath:path];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    [url release];

    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

    // Set the orientation from the first clip only
    if (i == 0) {
        [compositionVideoTrack setPreferredTransform:videoTrack.preferredTransform];
    }

    ok = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:videoTrack atTime:startTime error:nil];
    ok = [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:audioTrack atTime:startTime error:nil];

    startTime = CMTimeAdd(startTime, [asset duration]);
}

// Export the combined video
NSString *combinedPath = /* path of the combined video */;
NSURL *url = [[NSURL alloc] initFileURLWithPath:combinedPath];
AVAssetExportSession *exporter = [[[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPreset640x480] autorelease];
exporter.outputURL = url;
[url release];
exporter.outputFileType = [[exporter supportedFileTypes] objectAtIndex:0];
[exporter exportAsynchronouslyWithCompletionHandler:^(void) {
    [self combineVideoFinished:exporter.outputURL status:exporter.status error:exporter.error];
}];

The code above works fine if all the video clips were recorded in the same orientation (portrait or landscape). However, if the clips have a mixture of orientations, parts of the final video come out rotated 90 degrees to the right (or left), presumably because a composition track carries only one preferredTransform, which the code sets from the first clip.
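As an aside, a clip's recorded orientation can be inferred by applying the video track's preferredTransform to its naturalSize; a minimal sketch (the helper name is my own, not from AVFoundation):

// Sketch: infer whether a clip was recorded in portrait.
// Applying preferredTransform to naturalSize swaps width and height
// for 90-degree-rotated clips; the helper isTrackPortrait is hypothetical.
static BOOL isTrackPortrait(AVAssetTrack *videoTrack) {
    CGSize rendered = CGSizeApplyAffineTransform(videoTrack.naturalSize,
                                                 videoTrack.preferredTransform);
    return fabsf(rendered.height) > fabsf(rendered.width);
}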

I was wondering if there is a way to transform all clips to the same orientation (e.g. the orientation of the first clip) while composing them. From what I read in the Xcode documentation, AVMutableVideoCompositionLayerInstruction seems like it can be used to transform an AVAsset, but I am not sure how to create and apply several different layer instructions to the corresponding clips and then use them in the composition (AVMutableComposition*).

Any help would be appreciated!

asked Jul 04 '11 by Song

1 Answer

This is what I do. I then use an AVAssetExportSession to create the actual file. But I warn you: the CGAffineTransforms are sometimes applied late, so you'll see a frame or two of the original orientation before the video transforms. I have no clue why this happens; one combination of videos will yield the expected result, but sometimes it's off.

AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderScale = 1.0;

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];

// Keep only the paths the user selected
NSMutableArray *array = [NSMutableArray array];
for (NSString *string in videoPathArray) {
    if (![string isEqualToString:@""]) {
        [array addObject:string];
    }
}
self.videoPathArray = array;

float time = 0;

for (int i = 0; i < self.videoPathArray.count; i++) {
    AVURLAsset *sourceAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[videoPathArray objectAtIndex:i]] options:[NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey]];

    NSError *error = nil;
    BOOL ok = NO;
    AVAssetTrack *sourceVideoTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    CGSize temp = CGSizeApplyAffineTransform(sourceVideoTrack.naturalSize, sourceVideoTrack.preferredTransform);
    CGSize size = CGSizeMake(fabsf(temp.width), fabsf(temp.height));
    CGAffineTransform transform = sourceVideoTrack.preferredTransform;

    videoComposition.renderSize = sourceVideoTrack.naturalSize;
    if (size.width > size.height) {
        // Landscape clip: apply its preferred transform as-is
        [layerInstruction setTransform:transform atTime:CMTimeMakeWithSeconds(time, 30)];
    } else {
        // Portrait clip: scale it down to fit and center it horizontally
        float s = size.width / size.height;
        CGAffineTransform scaled = CGAffineTransformConcat(transform, CGAffineTransformMakeScale(s, s));
        float x = (size.height - size.width * s) / 2;
        CGAffineTransform centered = CGAffineTransformConcat(scaled, CGAffineTransformMakeTranslation(x, 0));
        [layerInstruction setTransform:centered atTime:CMTimeMakeWithSeconds(time, 30)];
    }

    ok = [compositionVideoTrack insertTimeRange:sourceVideoTrack.timeRange ofTrack:sourceVideoTrack atTime:[composition duration] error:&error];
    if (!ok) {
        // Deal with the error.
        NSLog(@"something went wrong");
    }

    NSLog(@"\n source asset duration is %f \n source vid track timerange is %f %f \n composition duration is %f \n composition vid track time range is %f %f",
          CMTimeGetSeconds([sourceAsset duration]),
          CMTimeGetSeconds(sourceVideoTrack.timeRange.start), CMTimeGetSeconds(sourceVideoTrack.timeRange.duration),
          CMTimeGetSeconds([composition duration]),
          CMTimeGetSeconds(compositionVideoTrack.timeRange.start), CMTimeGetSeconds(compositionVideoTrack.timeRange.duration));

    time += CMTimeGetSeconds(sourceVideoTrack.timeRange.duration);
}

instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
instruction.timeRange = compositionVideoTrack.timeRange;

videoComposition.instructions = [NSArray arrayWithObject:instruction];
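Note that the snippet above builds videoComposition but stops short of the export step it mentions. For the layer instructions to take effect, the video composition must be attached to the export session; a minimal sketch, assuming an outputURL of your choosing (the preset and file type here are illustrative, not from the answer):

// Sketch: wiring the video composition into the export session.
// outputURL is assumed to be a writable file URL.
AVAssetExportSession *exporter = [[[AVAssetExportSession alloc] initWithAsset:composition
                                                                   presetName:AVAssetExportPresetHighestQuality] autorelease];
exporter.outputURL = outputURL;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.videoComposition = videoComposition; // without this, the transforms are ignored
[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusFailed) {
        NSLog(@"export failed: %@", exporter.error);
    }
}];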


answered Oct 08 '22 by bogardon