 

Is it possible to merge two video files into one file, on one screen, in iOS?

I'm new to video programming. I'm trying to practice it, but I'm having trouble merging two video files into one.

The merge I mean is as follows:

I have a first video like this: [first video screenshot]

The second video looks like this: [second video screenshot]

I want them merged like this: [desired merged result screenshot]

I don't want to use two video players, because I want to send the merged video file to someone. I searched all day for a solution, but I couldn't find one.

I wrote code referencing this link, but it shows only the first video, not the merged result.

My Code:

NSURL *firstURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"video1" ofType:@"mp4"]];
AVURLAsset  *firstAsset = [[AVURLAsset alloc]initWithURL:firstURL options:nil];

NSURL *secondURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"video2" ofType:@"mp4"]];
AVURLAsset *secondAsset = [[AVURLAsset alloc]initWithURL:secondURL options:nil];

AVMutableComposition* mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                  preferredTrackID:kCMPersistentTrackID_Invalid];
[firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                    ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                     atTime:kCMTimeZero error:nil];

AVMutableCompositionTrack *secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];

[secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
                     ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                      atTime:kCMTimeZero error:nil];

[secondTrack setPreferredTransform:CGAffineTransformMakeScale(0.25f,0.25f)];

NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docsDir = [dirPaths objectAtIndex:0];
NSString *outputFilePath = [docsDir stringByAppendingPathComponent:[NSString stringWithFormat:@"FinalVideo.mov"]];

NSLog(@"%@", outputFilePath);

NSURL *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
    [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];


AVAssetExportSession* assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
assetExport.outputFileType = @"com.apple.quicktime-movie";
assetExport.outputURL = outputFileUrl;

[assetExport exportAsynchronouslyWithCompletionHandler:^(void) {

     switch (assetExport.status) {
         case AVAssetExportSessionStatusFailed:
             NSLog(@"AVAssetExportSessionStatusFailed");
             break;
         case AVAssetExportSessionStatusCompleted:
             NSLog(@"AVAssetExportSessionStatusCompleted");
             break;
         case AVAssetExportSessionStatusWaiting:
             NSLog(@"AVAssetExportSessionStatusWaiting");
             break;
         default:
             break;
     }
 }];

What am I missing? I don't know how to approach this problem.

Appreciate any ideas. Thanks.

Edit:

I wrote new code based on the link matt posted (thanks, matt), but when I try to export it, only the first video is exported, not both together. :(

My new code is:

NSURL *originalVideoURL1 = [[NSBundle mainBundle] URLForResource:@"video1" withExtension:@"mov"];
NSURL *originalVideoURL2 = [[NSBundle mainBundle] URLForResource:@"video2" withExtension:@"mov"];


AVURLAsset *firstAsset = [AVURLAsset URLAssetWithURL:originalVideoURL1 options:nil];
AVURLAsset *secondAsset = [AVURLAsset URLAssetWithURL:originalVideoURL2 options:nil];

AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init]; //[AVMutableComposition composition];

NSError *error = nil;
AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration) ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&error];

if(error) {
    NSLog(@"firstTrack error!!!. %@", error.localizedDescription);
}

AVMutableCompositionTrack *secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration) ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&error];

if(error) {
    NSLog(@"secondTrack error!!!. %@", error.localizedDescription);
}


AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstAsset.duration);

AVMutableVideoCompositionLayerInstruction *firstLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
CGAffineTransform scale = CGAffineTransformMakeScale(0.7, 0.7);
CGAffineTransform move = CGAffineTransformMakeTranslation(230, 230);
[firstLayerInstruction setTransform:CGAffineTransformConcat(scale, move) atTime:kCMTimeZero];

AVMutableVideoCompositionLayerInstruction *secondLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];
CGAffineTransform secondScale = CGAffineTransformMakeScale(1.2, 1.5);
CGAffineTransform secondMove = CGAffineTransformMakeTranslation(0, 0);
[secondLayerInstruction setTransform:CGAffineTransformConcat(secondScale, secondMove) atTime:kCMTimeZero];

mainInstruction.layerInstructions = @[firstLayerInstruction, secondLayerInstruction];

AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
mainCompositionInst.instructions = @[mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
mainCompositionInst.renderSize = CGSizeMake(640, 480);

AVPlayerItem *newPlayerItem = [AVPlayerItem playerItemWithAsset:mixComposition];
newPlayerItem.videoComposition = mainCompositionInst;

AVPlayer *player = [[AVPlayer alloc] initWithPlayerItem:newPlayerItem];

AVPlayerLayer *playerLayer =[AVPlayerLayer playerLayerWithPlayer:player];

[playerLayer setFrame:self.view.bounds];
[self.view.layer addSublayer:playerLayer];
[player seekToTime:kCMTimeZero];
[player play]; // playback works fine!


NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];

NSString *tempS2 = [documentsDirectory stringByAppendingPathComponent:@"FinalVideo.mov"];

if([[NSFileManager defaultManager] fileExistsAtPath:tempS2])
{
    [[NSFileManager defaultManager] removeItemAtPath:tempS2 error:nil];
}


NSURL *url = [[NSURL alloc] initFileURLWithPath: tempS2];

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]
                                       initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];

exportSession.outputURL=url;

NSLog(@"%@", [exportSession supportedFileTypes]);

exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status==AVAssetExportSessionStatusFailed) {
        NSLog(@"failed");
    }
    else {
        NSLog(@"AudioLocation : %@",tempS2);
    }
}];

How can I export my mixComposition together with my layer instructions?

Please give me some more ideas.

Thanks.

Asked Nov 08 '22 by MoonSoo


1 Answer

With reference to the code in your second edit: just as you've told the AVPlayerItem about your AVMutableVideoComposition, you also need to tell the AVAssetExportSession:

exportSession.videoComposition = mainCompositionInst;
// exportAsynchronouslyWithCompletionHandler etc
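
In context, a minimal sketch of how the export portion of your second code block might look with the composition attached (variable names are taken from your snippet; treat this as a sketch, not a drop-in fix):

exportSession.outputURL = url;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;

// Without this line the exporter ignores the layer instructions and
// renders only the first composition track.
exportSession.videoComposition = mainCompositionInst;

[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"exported to %@", tempS2);
    }
}];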

N.B. make sure you choose the longer of the two track durations when setting your instruction duration:

mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMaximum(firstAsset.duration, secondAsset.duration));

AVPlayer doesn't mind if you get this wrong, but AVAssetExportSession does and will return an AVErrorInvalidVideoComposition (-11841) error.
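
If you want to confirm that's the failure you're hitting, inspect the export session's error in the failure branch; a small sketch (the specific logging is just an illustration):

if (exportSession.status == AVAssetExportSessionStatusFailed) {
    NSError *error = exportSession.error;
    // A mismatched instruction timeRange typically surfaces as
    // AVFoundationErrorDomain code -11841 (AVErrorInvalidVideoComposition).
    NSLog(@"export failed: %@ (code %ld)", error.localizedDescription, (long)error.code);
}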

N.B. 2 Your AVPlayer isn't actually going out of scope, but it makes me nervous when I look at it. I'd assign it to a property if I were you.
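
For example, a minimal sketch of keeping the player in a strong property (assuming this code lives in a view controller; the property name is illustrative):

@property (nonatomic, strong) AVPlayer *player; // in the class extension

// ...

self.player = [[AVPlayer alloc] initWithPlayerItem:newPlayerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
playerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:playerLayer];
[self.player play];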

Answered Nov 14 '22 by Rhythmic Fistman