Mixing Images and Video using AVFoundation

I'm trying to splice images into a pre-existing video to create a new video file, using AVFoundation on Mac.

So far I've read the Apple documentation example and these posts:

- ASSETWriterInput for making Video from UIImages on Iphone Issues
- Mix video with static image in CALayer using AVVideoCompositionCoreAnimationTool
- AVFoundation Tutorial: Adding Overlays and Animations to Videos

plus a few other SO links.

Now, these have proved pretty useful at times, but my problem is that I'm not creating a static watermark or overlay exactly; I want to put images in between parts of the video. So far I've managed to take the video, create blank sections where these images should be inserted, and export it.

My problem is getting the images to insert themselves into these blank sections. The only feasible way I can see to do it is to create a series of layers that are animated to change their opacity at the correct times, but I can't seem to get the animation to work.

The code below is what I'm using to create the video segments and layer animations.

    //https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/03_Editing.html#//apple_ref/doc/uid/TP40010188-CH8-SW7
    
    // let's start by making our video composition
    AVMutableComposition* mutableComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack* mutableCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    
    AVMutableVideoComposition* mutableVideoComposition = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:gVideoAsset];
    
    // if the first point's frame doesn't start on 0
    if (gFrames[0].startTime.value != 0)
    {
        DebugLog("Inserting vid at 0");
        // then add the video track to the composition track with a time range from 0 to the first point's startTime
        [mutableCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, gFrames[0].startTime) ofTrack:gVideoTrack atTime:kCMTimeZero error:&gError];
        
    }
    
    if(gError)
    {
        DebugLog("Error inserting original video segment");
        GetError();
    }
    
    // create our parent layer and video layer
    CALayer* parentLayer = [CALayer layer];
    CALayer* videoLayer = [CALayer layer];
    
    parentLayer.frame = CGRectMake(0, 0, 1280, 720);
    videoLayer.frame = CGRectMake(0, 0, 1280, 720);
    
    [parentLayer addSublayer:videoLayer];
    
    // create an offset value that should be added to each point where a new video segment should go
    CMTime timeOffset = CMTimeMake(0, 600);
    
    // loop through each additional frame
    for(int i = 0; i < gFrames.size(); i++)
    {
        // create an animation layer and assign its contents to the CGImage of the frame
        CALayer* Frame = [CALayer layer];
        Frame.contents = (__bridge id)gFrames[i].frameImage;
        Frame.frame = CGRectMake(0, 720, 1280, -720);
        
        DebugLog("inserting empty time range");
        // add frame point to the composition track starting at the point's start time
        // insert an empty time range for the duration of the frame animation
        [mutableCompositionTrack insertEmptyTimeRange:CMTimeRangeMake(CMTimeAdd(gFrames[i].startTime, timeOffset), gFrames[i].duration)];
        
        // update the time offset by the duration
        timeOffset = CMTimeAdd(timeOffset, gFrames[i].duration);
        
        // make the layer completely transparent
        Frame.opacity = 0.0f;
        
        // create an animation for setting opacity to 0 on start
        CABasicAnimation* frameAnim = [CABasicAnimation animationWithKeyPath:@"opacity"];
        frameAnim.duration = 1.0f;
        frameAnim.repeatCount = 0;
        frameAnim.autoreverses = NO;
        
        frameAnim.fromValue = [NSNumber numberWithFloat:0.0];
        frameAnim.toValue = [NSNumber numberWithFloat:0.0];
        
        frameAnim.beginTime = AVCoreAnimationBeginTimeAtZero;
        frameAnim.speed = 1.0f;
        
        [Frame addAnimation:frameAnim forKey:@"animateOpacity"];
        
        // create an animation for setting opacity to 1
        frameAnim = [CABasicAnimation animationWithKeyPath:@"opacity"];
        frameAnim.duration = 1.0f;
        frameAnim.repeatCount = 0;
        frameAnim.autoreverses = NO;
        
        frameAnim.fromValue = [NSNumber numberWithFloat:1.0];
        frameAnim.toValue = [NSNumber numberWithFloat:1.0];
        
        frameAnim.beginTime = AVCoreAnimationBeginTimeAtZero + CMTimeGetSeconds(gFrames[i].startTime);
        frameAnim.speed = 1.0f;
        
        [Frame addAnimation:frameAnim forKey:@"animateOpacity"];
        
        // create an animation for setting opacity to 0
        frameAnim = [CABasicAnimation animationWithKeyPath:@"opacity"];
        frameAnim.duration = 1.0f;
        frameAnim.repeatCount = 0;
        frameAnim.autoreverses = NO;
        
        frameAnim.fromValue = [NSNumber numberWithFloat:0.0];
        frameAnim.toValue = [NSNumber numberWithFloat:0.0];
        
        frameAnim.beginTime = AVCoreAnimationBeginTimeAtZero + CMTimeGetSeconds(gFrames[i].endTime);
        frameAnim.speed = 1.0f;
        
        [Frame addAnimation:frameAnim forKey:@"animateOpacity"];
        
        // add the frame layer to our parent layer
        [parentLayer addSublayer:Frame];
        
        gError = nil;
        
        // if there's another point after this one
        if( i < gFrames.size()-1)
        {
            // add our video file to the composition with a range of this point's end and the next point's start
            [mutableCompositionTrack insertTimeRange:CMTimeRangeMake(gFrames[i].startTime,
                            CMTimeMake(gFrames[i+1].startTime.value - gFrames[i].startTime.value, 600))
                            ofTrack:gVideoTrack
                            atTime:CMTimeAdd(gFrames[i].startTime, timeOffset) error:&gError];
            
        }
        // else just add our video file with a range of this points end point and the videos duration
        else
        {
            [mutableCompositionTrack insertTimeRange:CMTimeRangeMake(gFrames[i].startTime, CMTimeSubtract(gVideoAsset.duration, gFrames[i].startTime)) ofTrack:gVideoTrack atTime:CMTimeAdd(gFrames[i].startTime, timeOffset) error:&gError];
        }
        
        if(gError)
        {
            char errorMsg[256];
            sprintf(errorMsg, "Error inserting original video segment at: %d", i);
            DebugLog(errorMsg);
            GetError();
        }
    }

Now, in that segment the Frame's opacity is set to 0.0f; however, when I set it to 1.0f, all it does is place the last of these frames on top of the video for the entire duration.
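For what it's worth, animations used with AVVideoCompositionCoreAnimationTool usually also need removedOnCompletion set to NO plus a fill mode, since Core Animation strips finished animations by default, and each call to addAnimation:forKey: with the same key replaces the previous animation on that layer. A minimal sketch of one opacity step set up that way (the timing values here are illustrative, not from the original code):

    // sketch: an opacity "hold at 1" step configured for Core Animation export
    CABasicAnimation* showAnim = [CABasicAnimation animationWithKeyPath:@"opacity"];
    showAnim.fromValue = [NSNumber numberWithFloat:1.0f];
    showAnim.toValue = [NSNumber numberWithFloat:1.0f];
    showAnim.duration = 1.0f;
    showAnim.beginTime = AVCoreAnimationBeginTimeAtZero + 2.0; // illustrative start time
    showAnim.removedOnCompletion = NO;       // keep the animation after it finishes
    showAnim.fillMode = kCAFillModeForwards; // hold the final value afterwards
    [Frame addAnimation:showAnim forKey:@"showOpacity"];      // unique key per step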

After that, the video is exported using an AVAssetExportSession, as shown below:

    mutableVideoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
    
    // create a layer instruction for our newly created animation tool
    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:gVideoTrack];
    
    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    [instruction setTimeRange:CMTimeRangeMake(kCMTimeZero, [mutableComposition duration])];
    [layerInstruction setOpacity:1.0f atTime:kCMTimeZero];
    [layerInstruction setOpacity:0.0f atTime:mutableComposition.duration];
    instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
    
    // set the instructions on our videoComposition
    mutableVideoComposition.instructions = [NSArray arrayWithObject:instruction];
    
    // export final composition to a video file
    
    // convert the videopath into a url for our AVAssetWriter to create a file at
    NSString* vidPath = CreateNSString(outputVideoPath);
    NSURL* vidURL = [NSURL fileURLWithPath:vidPath];
    
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition presetName:AVAssetExportPreset1280x720];
    
    exporter.outputFileType = AVFileTypeMPEG4;
    
    exporter.outputURL = vidURL;
    exporter.videoComposition = mutableVideoComposition;
    exporter.timeRange = CMTimeRangeMake(kCMTimeZero, mutableComposition.duration);
    
    // Asynchronously export the composition to a video file and save this file to the camera roll once export completes.
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            if (exporter.status == AVAssetExportSessionStatusCompleted)
            {
                DebugLog("!!!file created!!!");
                _Close();
            }
            else if(exporter.status == AVAssetExportSessionStatusFailed)
            {
                DebugLog("failed damn");
                DebugLog(cStringCopy([[[exporter error] localizedDescription] UTF8String]));
                DebugLog(cStringCopy([[[exporter error] description] UTF8String]));
                _Close();
            }
            else
            {
                DebugLog("NoIdea");
                _Close();
            }
        });
    }];
    
    

I get the feeling that the animation is not being started, but I don't know. Am I going about this the right way to splice image data into a video?

Any assistance would be greatly appreciated.

asked Oct 20 '14 by Tom Haygarth


1 Answer

Well, I solved my issue another way. The animation route was not working, so my solution was to compile all of my insertable images into a temporary video file, and use that video to insert the images into my final output video.

Starting with the first link I originally posted, ASSETWriterInput for making Video from UIImages on Iphone Issues, I created the following function to build my temporary video:

void CreateFrameImageVideo(NSString* path)
{
    NSLog(@"Creating writer at path %@", path);
    NSError *error = nil;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                  [NSURL fileURLWithPath:path] fileType:AVFileTypeMPEG4
                                                              error:&error];

    NSLog(@"Creating video codec settings");
    NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:gVideoTrack.estimatedDataRate/*128000*/], AVVideoAverageBitRateKey,
                                   [NSNumber numberWithInt:gVideoTrack.nominalFrameRate],AVVideoMaxKeyFrameIntervalKey,
                                   AVVideoProfileLevelH264MainAutoLevel, AVVideoProfileLevelKey,
                                   nil];

    NSLog(@"Creating video settings");
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   codecSettings,AVVideoCompressionPropertiesKey,
                                   [NSNumber numberWithInt:1280], AVVideoWidthKey,
                                   [NSNumber numberWithInt:720], AVVideoHeightKey,
                                   nil];

    NSLog(@"Creating writter input");
    AVAssetWriterInput* writerInput = [[AVAssetWriterInput
                                        assetWriterInputWithMediaType:AVMediaTypeVideo
                                        outputSettings:videoSettings] retain];

    NSLog(@"Creating adaptor");
    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                     sourcePixelBufferAttributes:nil];

    [videoWriter addInput:writerInput];

    NSLog(@"Starting session");
    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];


    CMTime timeOffset = kCMTimeZero;

    NSLog(@"Video Width %d, Height: %d, writing frame video to file", gWidth, gHeight);

    CVPixelBufferRef buffer;

    for(int i = 0; i< gAnalysisFrames.size(); i++)
    {
        while (adaptor.assetWriterInput.readyForMoreMediaData == FALSE) {
            NSLog(@"Waiting inside a loop");
            NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.1];
            [[NSRunLoop currentRunLoop] runUntilDate:maxDate];
        }

        //Write samples:
        buffer = pixelBufferFromCGImage(gAnalysisFrames[i].frameImage, gWidth, gHeight);

        [adaptor appendPixelBuffer:buffer withPresentationTime:timeOffset];

        // release the buffer created by pixelBufferFromCGImage so it doesn't leak
        CVPixelBufferRelease(buffer);

        timeOffset = CMTimeAdd(timeOffset, gAnalysisFrames[i].duration);
    }

    while (adaptor.assetWriterInput.readyForMoreMediaData == FALSE) {
        NSLog(@"Waiting outside a loop");
        NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.1];
        [[NSRunLoop currentRunLoop] runUntilDate:maxDate];
    }

    // append the last frame again at the final timestamp so the writer has a
    // sample that lasts until the session's end time
    buffer = pixelBufferFromCGImage(gAnalysisFrames[gAnalysisFrames.size()-1].frameImage, gWidth, gHeight);
    [adaptor appendPixelBuffer:buffer withPresentationTime:timeOffset];
    CVPixelBufferRelease(buffer);

    NSLog(@"Finishing session");
    //Finish the session:
    [writerInput markAsFinished];
    [videoWriter endSessionAtSourceTime:timeOffset];
    BOOL successfulWrite = [videoWriter finishWriting];

    // if we failed to write the video
    if(!successfulWrite)
    {

        NSLog(@"Session failed with error: %@", [[videoWriter error] description]);

        // delete the temporary file created
        NSFileManager *fileManager = [NSFileManager defaultManager];
        if ([fileManager fileExistsAtPath:path]) {
            NSError *error;
            if ([fileManager removeItemAtPath:path error:&error] == NO) {
                NSLog(@"removeItemAtPath %@ error:%@", path, error);
            }
        }
    }
    else
    {
        NSLog(@"Session complete");
    }

    [writerInput release];
    // balance the alloc/init of videoWriter under manual reference counting
    [videoWriter release];

}
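
For reference, pixelBufferFromCGImage is a CGImage-to-CVPixelBuffer helper along the lines of the one in the linked question; a minimal sketch of such a helper (the pixel format and compatibility options here are assumptions, not my exact code) looks like this:

static CVPixelBufferRef pixelBufferFromCGImage(CGImageRef image, int width, int height)
{
    // sketch only: create a pixel buffer that Core Graphics can draw into
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], (id)kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], (id)kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32ARGB,
                        (CFDictionaryRef)options, &pxbuffer);

    // draw the CGImage into the buffer's backing memory
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, width, height, 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace,
                                                 (CGBitmapInfo)kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);

    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    // the caller is responsible for CVPixelBufferRelease-ing the result
    return pxbuffer;
}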

After the video is created, it is loaded as an AVAsset and its video track is extracted. The video is then inserted by replacing the following line (from the first code block in the original post):

[mutableCompositionTrack insertEmptyTimeRange:CMTimeRangeMake(CMTimeAdd(gFrames[i].startTime, timeOffset), gFrames[i].duration)];

with:

[mutableCompositionTrack insertTimeRange:CMTimeRangeMake(timeOffset,gAnalysisFrames[i].duration)
                                     ofTrack:gFramesTrack
                                     atTime:CMTimeAdd(gAnalysisFrames[i].startTime, timeOffset) error:&gError];

where gFramesTrack is the AVAssetTrack created from the temporary frame video.
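
The loading step looks roughly like this (a sketch; tempFramesPath stands in for whatever path was passed to CreateFrameImageVideo):

// sketch: build the temporary frame video, then load it and pull out its video track
NSString *tempFramesPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"frames.mp4"];
CreateFrameImageVideo(tempFramesPath);

AVAsset *framesAsset = [AVAsset assetWithURL:[NSURL fileURLWithPath:tempFramesPath]];
AVAssetTrack *gFramesTrack = [[framesAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];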

All of the code relating to the CALayer and CABasicAnimation objects has been removed, as it just was not working.

Not the most elegant solution, I don't think, but one that at least works. I hope someone finds this useful.

This code also works on iOS devices (tested using an iPad 3).

Side note: the DebugLog function from the first post is just a callback to a function that prints out log messages. It can be replaced with NSLog() calls if need be.

answered Sep 19 '22 by Tom Haygarth