AVMutableComposition issue: black frame at the end

I am capturing video using AVCaptureConnection in my iOS app. After that I add some images to the video as CALayers. Everything works fine, but I get a black frame at the very end of the resulting video after adding the images. No actual audio/video frames are affected by this. For the audio, I extract it, change its pitch, and then add it using AVMutableComposition. Here is the code I am using. Please help me figure out what I am doing wrong, or whether I need to add something else.

    cmp = [AVMutableComposition composition];

    AVMutableCompositionTrack *videoComposition = [cmp addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioComposition = [cmp addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *sourceVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *sourceAudioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];


    [videoComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil] ;
    [audioComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:sourceAudioTrack atTime:kCMTimeZero error:nil];

    animComp = [AVMutableVideoComposition videoComposition];
    animComp.renderSize = CGSizeMake(320, 320);
    animComp.frameDuration = CMTimeMake(1,30);
    animComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer  inLayer:parentLayer];

    // to gather the audio part of the video
    NSArray *tracksToDuck = [cmp tracksWithMediaType:AVMediaTypeAudio];
    NSMutableArray *trackMixArray = [NSMutableArray array];
    for (NSInteger i = 0; i < [tracksToDuck count]; i++) {
        AVMutableAudioMixInputParameters *trackMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:[tracksToDuck objectAtIndex:i]];
        [trackMix setVolume:5 atTime:kCMTimeZero];
        [trackMixArray addObject:trackMix];
    }
    audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = trackMixArray;

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [asset duration]);
    AVMutableVideoCompositionLayerInstruction *layerVideoInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoComposition];

    [layerVideoInstruction setOpacity:1.0 atTime:kCMTimeZero];
    instruction.layerInstructions = [NSArray arrayWithObject:layerVideoInstruction] ;
    animComp.instructions = [NSArray arrayWithObject:instruction];
    [self exportMovie:self];

This is my method for exporting the video:

-(IBAction) exportMovie:(id)sender{

    //successCheck = NO;
    NSArray *docPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *tempPath = [docPaths objectAtIndex:0];
    //NSLog(@"Temp Path: %@",tempPath);

    NSString *fileName = [NSString stringWithFormat:@"%@/Final.MP4",tempPath];
    NSFileManager *fileManager = [NSFileManager defaultManager] ;
    if([fileManager fileExistsAtPath:fileName ]){
        NSError *ferror = nil ;
        [fileManager removeItemAtPath:fileName error:&ferror];
    }

    NSURL *exportURL = [NSURL fileURLWithPath:fileName];

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:cmp presetName:AVAssetExportPresetMediumQuality]  ;
    exporter.outputURL = exportURL;
    exporter.videoComposition = animComp;
    //exporter.audioMix = audioMix;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;

    [exporter exportAsynchronouslyWithCompletionHandler:^(void){
        switch (exporter.status) {
            case AVAssetExportSessionStatusFailed: {
                NSLog(@"Fail");
                break;
            }
            case AVAssetExportSessionStatusCompleted: {
                NSLog(@"Success video");
                break;
            }
            default:
                break;
        }
    }];
    NSLog(@"outside");
}
Asked Apr 05 '13 by Swati


2 Answers

AVAssetExportSession has a timeRange property that lets you limit what gets exported. Try giving it a time range slightly shorter than the actual duration (a few nanoseconds less).
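
For example, something along these lines could be set on the exporter before the exportAsynchronouslyWithCompletionHandler: call (a minimal sketch reusing the cmp and exporter variables from the question; the exact amount to trim is a guess and may need adjusting):

    // Trim a tiny amount off the end so the trailing black frame is not exported.
    // The 1/600 s value is an arbitrary assumption, not a verified figure.
    CMTime fullDuration = [cmp duration];
    CMTime trimmedDuration = CMTimeSubtract(fullDuration, CMTimeMake(1, 600));
    exporter.timeRange = CMTimeRangeMake(kCMTimeZero, trimmedDuration);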

Answered Oct 04 '22 by Khushboo


You can get the true video duration from AVAssetTrack. The duration of an AVAsset is sometimes longer than that of its AVAssetTrack.

Check the durations like this:

print(asset.duration.seconds.description)
print(videoTrack.timeRange.duration.description)

So you can change this line:

[videoComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil] ;

to this:

[videoComposition insertTimeRange:sourceVideoTrack.timeRange ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil];

For Swift 5:

videoComposition.insertTimeRange(sourceVideoTrack.timeRange, of: sourceVideoTrack, at: CMTime.zero)

Then you will avoid the last black frame :)
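
The audio insert in the question uses [asset duration] as well, so presumably the same idea applies there (a sketch reusing the variables from the question; I have not verified that the audio track has the same duration mismatch):

    // Same fix applied to the audio insert: use the audio track's own
    // timeRange instead of the (possibly longer) [asset duration].
    [audioComposition insertTimeRange:sourceAudioTrack.timeRange
                              ofTrack:sourceAudioTrack
                               atTime:kCMTimeZero
                                error:nil];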

Hope this helps someone still struggling with this.

Answered Oct 04 '22 by Tomo