
AVFoundation add first frame to video

I'm trying to control the way that videos produced by my app appear in the Photos app on iOS. All videos that I produce start out with a black frame, then things fade in and out, etc. When these are saved to Photos, Apple takes the first frame (a black square) and uses it as a thumbnail in Photos. I'd like to change this so that I can set my own thumbnail for people to easily recognize the video.

Since I can't find any built-in API for this, I'm trying to hack it by adding a thumbnail I generate as the first frame of the video. I'm using AVFoundation for this, but I'm running into some issues.

My code throws the following error: '[AVAssetReaderTrackOutput copyNextSampleBuffer] cannot copy next sample buffer before adding this output to an instance of AVAssetReader (using -addOutput:) and calling -startReading on that asset reader', despite my having called both of those methods.

Here is my code:

AVAsset *asset = [[AVURLAsset alloc] initWithURL:fileUrl options:nil];
UIImage *frame = [self generateThumbnail:asset];

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:640], AVVideoWidthKey,
                               [NSNumber numberWithInt:360], AVVideoHeightKey,
                               nil];

AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:asset error:nil];
AVAssetReaderOutput *readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:[asset.tracks firstObject]
                                                                               outputSettings:nil];
[assetReader addOutput:readerOutput];

AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:path
                                                       fileType:AVFileTypeMPEG4
                                                          error:nil];
NSParameterAssert(videoWriter);

AVAssetWriterInput* writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                     outputSettings:videoSettings];

AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                 assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                 sourcePixelBufferAttributes:nil];

NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);

[videoWriter addInput:writerInput];

[assetReader startReading];
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];

CVPixelBufferRef buffer = [self pixelBufferFromCGImage:frame.CGImage andSize:frame.size];

BOOL append_ok = NO;
while (!append_ok) {
    if (adaptor.assetWriterInput.readyForMoreMediaData) {
        append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
        CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
        NSParameterAssert(bufferPool != NULL);

        [NSThread sleepForTimeInterval:0.05];
    } else {
        [NSThread sleepForTimeInterval:0.1];
    }
}
CVBufferRelease(buffer);

dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
[writerInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^{
    CMSampleBufferRef nextBuffer;
    while (writerInput.readyForMoreMediaData) {
        nextBuffer = [readerOutput copyNextSampleBuffer];
        if(nextBuffer) {
            NSLog(@"Wrote: %zu bytes", CMSampleBufferGetTotalSampleSize(nextBuffer));
            [writerInput appendSampleBuffer:nextBuffer];
        } else {
            [writerInput markAsFinished];
            [videoWriter finishWritingWithCompletionHandler:^{
                //int res = videoWriter.status;
            }];
            break;
        }
    }
}];

I've tried some variations on this, all to no avail. I've also seen some crashes that seem related to the file format. I'm using an .mp4 file (I'm not sure how to find out its compression status or whether it's supported), but I haven't been able to make it work even with an uncompressed .mov file (recorded with Photo Booth on a Mac).
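
For reference, I assume the closest I can get to checking the codec is dumping the format descriptions of the source video track, roughly like this (just a sketch; it logs the codec's four-char code but doesn't tell me whether the format is supported):

// Rough sketch: log the codec four-char code of the first video track.
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
for (id desc in videoTrack.formatDescriptions) {
    CMFormatDescriptionRef formatDesc = (__bridge CMFormatDescriptionRef)desc;
    FourCharCode codec = CMFormatDescriptionGetMediaSubType(formatDesc);
    NSLog(@"Codec: %c%c%c%c",
          (int)((codec >> 24) & 0xFF), (int)((codec >> 16) & 0xFF),
          (int)((codec >> 8) & 0xFF), (int)(codec & 0xFF));
}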

Any ideas what I'm doing wrong?

asked Dec 22 '14 by adi225


1 Answer

Just had the same problem.

Your assetReader is a local variable, so ARC releases it once the method returns. The block that reads sample buffers from readerOutput, however, keeps running and tries to read after that point.

Once assetReader is deallocated, readerOutput is no longer attached to any reader, hence the error telling you to add the output to an AVAssetReader and call -startReading.

The fix is to make sure assetReader isn't released prematurely, e.g. by keeping a strong reference to it in a property.
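
A rough sketch of what that could look like (class and property names are placeholders; the rest of your setup code stays the same):

// Keep strong references so ARC doesn't deallocate the reader (and writer)
// while the asynchronous block is still pulling sample buffers.
@interface MyVideoExporter ()   // hypothetical class
@property (nonatomic, strong) AVAssetReader *assetReader;
@property (nonatomic, strong) AVAssetWriter *videoWriter;
@end

// Inside the export method, assign to the properties instead of locals:
self.assetReader = [AVAssetReader assetReaderWithAsset:asset error:nil];
[self.assetReader addOutput:readerOutput];

self.videoWriter = [[AVAssetWriter alloc] initWithURL:path
                                              fileType:AVFileTypeMPEG4
                                                 error:nil];
// ...configure writerInput and the adaptor exactly as before...

[self.assetReader startReading];
[self.videoWriter startWriting];
[self.videoWriter startSessionAtSourceTime:kCMTimeZero];

// The requestMediaDataWhenReadyOnQueue: block can now safely call
// [readerOutput copyNextSampleBuffer], because the reader is still
// alive when the block runs on the background queue.

Alternatively, capturing assetReader strongly inside the block would also keep it alive for the duration of the read.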

answered Oct 04 '22 by kolinko