
iOS Swift: Trying to write an image buffer to construct a video; finishWritingWithCompletionHandler not called, output video contains zero bytes

I'm trying to write just two frames from a static image to construct a video. I've been fiddling with the time parameters a bit, but the last step, finishWritingWithCompletionHandler, never seems to be called ("finished writing..." never gets printed). Only a zero-byte .mp4 file is created, and no errors occur. I can't figure out why. Here's the code that I use:

func createBackgroundVideo(CompletionHandler: (path: String)->Void) {

    var maybeError: NSError?
    let fileMgr = NSFileManager.defaultManager()
    let docDirectory = NSHomeDirectory().stringByAppendingPathComponent("Documents")
    let videoOutputPath = docDirectory.stringByAppendingPathComponent(BgVideoName)

    if (!fileMgr.removeItemAtPath(videoOutputPath, error: &maybeError)) {
        NSLog("Umable to delete file: %@", maybeError!.localizedDescription)
    }

    println(videoOutputPath)

    let videoWriter = AVAssetWriter(
        URL: NSURL(fileURLWithPath: videoOutputPath),
        fileType: AVFileTypeQuickTimeMovie,
        error: &maybeError
    )

    var videoSettings = [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: NSNumber(float: Float(videoWidth)),
        AVVideoHeightKey: NSNumber(float: Float(videoHeight))
    ]

    var avAssetInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)
    avAssetInput.expectsMediaDataInRealTime = true

    var adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: avAssetInput, sourcePixelBufferAttributes: nil)

    videoWriter.addInput(avAssetInput)
    videoWriter.startWriting()
    videoWriter.startSessionAtSourceTime(kCMTimeZero)

    var frameCount: Int64 = 0;
    var buffer: CVPixelBufferRef

    //buffer = PixelBuffer.pixelBufferFromCGImage2(self.bgImage.CGImage, andSize: CGSizeMake(videoWidth, videoHeight)).takeUnretainedValue()

    for i in 1...2 {
        buffer = PixelBuffer.pixelBufferFromCGImage2(self.bgImage.CGImage, andSize: CGSizeMake(videoWidth, videoHeight)).takeUnretainedValue()
        var appendOk = false
        var retries: Int = 0

        while (!appendOk && retries < 30) {
            if (adaptor.assetWriterInput.readyForMoreMediaData) {
                let frameTime = CMTimeMake(frameCount, 1);
                appendOk = adaptor.appendPixelBuffer(buffer, withPresentationTime: frameTime)
                if (!appendOk) {
                    println("some erorr occurred", videoWriter.error)
                } else {
                    println("pixel written")
                }
            } else {
                println("adaptor is not ready....")
                NSThread.sleepForTimeInterval(0.1)
            }
            retries++
        }

        if (!appendOk) {
            println("Error appending image....")
        }

        frameCount++
    }

    avAssetInput.markAsFinished()
    videoWriter.finishWritingWithCompletionHandler({() -> Void in
        println("finished writing...")
        CompletionHandler(path: videoOutputPath)
    })
}

The pixel buffer is created by a pixelBufferFromCGImage method written in Obj-C (I've added the headers and the bridging header, and that part seems to work fine):

+ (CVPixelBufferRef) pixelBufferFromCGImage2: (CGImageRef) image andSize:(CGSize) size {

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          size.width,
                                          size.height,
                                          kCVPixelFormatType_32ARGB,
                                          (__bridge CFDictionaryRef) options,
                                          &pxbuffer);
    if (status != kCVReturnSuccess){
        NSLog(@"Failed to create pixel buffer");
    }

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
                                                 size.height, 8, 4*size.width, rgbColorSpace,
                                                 kCGImageAlphaPremultipliedFirst);

    float offsetY = size.height / 2 - CGImageGetHeight(image) / 2;
    float offsetX = size.width / 2 - CGImageGetWidth(image) / 2;

    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
    CGContextDrawImage(context, CGRectMake(offsetX, offsetY, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}

Thanks for reading.


1 Answer

Maybe your videoSettings dictionary is not complete. Try setting up some more atom information like this:

var videoCleanApertureSettings = [AVVideoCleanApertureWidthKey:Int(self.width),
                                 AVVideoCleanApertureHeightKey:Int(self.height),
                       AVVideoCleanApertureHorizontalOffsetKey:0,
                         AVVideoCleanApertureVerticalOffsetKey:0]

var videoAspectRatioSettings = [AVVideoPixelAspectRatioHorizontalSpacingKey:1,
                                  AVVideoPixelAspectRatioVerticalSpacingKey:1]

var codecSettings = [AVVideoCleanApertureKey:videoCleanApertureSettings,
                  AVVideoPixelAspectRatioKey:videoAspectRatioSettings]

var videoSettings = [AVVideoCodecKey:AVVideoCodecH264,
     AVVideoCompressionPropertiesKey:codecSettings,
                     AVVideoWidthKey:Int(self.width),
                    AVVideoHeightKey:Int(self.height)]

You start the video at timestamp zero. That's okay:

[self.videoWriter startSessionAtSourceTime:kCMTimeZero];

Maybe the timestamps of your video frames are not far enough apart for you to see anything. If each image needs to be displayed for some seconds, you can do something like this:

int64_t newFrameNumber = (int64_t)(presentationTimeInSeconds * 60.);
CMTime frameTime = CMTimeMake(newFrameNumber, 60);

Using 60 as the timescale lets you use seconds as the unit with good resolution.
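For example, a minimal sketch in the style of the question's loop (the two-second duration is a made-up value, not something from the question):

// a sketch, assuming each image should stay on screen for two seconds
let secondsPerFrame = 2.0
let timescale: Int32 = 60
for i in 0..<2 {
    // frame i is presented at i * secondsPerFrame seconds
    let frameNumber = Int64(Double(i) * secondsPerFrame * Double(timescale))
    let frameTime = CMTimeMake(frameNumber, timescale)
    // append the pixel buffer with frameTime, as in the question's loop
}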

For producing a slideshow in "realtime", you can use NSDate to encode the timestamp:

int64_t newFrameNumber = (int64_t)(fabs([self.videoStartDate timeIntervalSinceNow]) * 60.);

where self.videoStartDate is an [NSDate date] value which you set immediately after starting the video.
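The same idea in Swift could look like this (a sketch; videoStartDate is assumed to be an NSDate property you set right after starting the writer):

// a sketch: derive the presentation timestamp from the wall clock
let elapsedSeconds = fabs(self.videoStartDate.timeIntervalSinceNow)
let frameTime = CMTimeMake(Int64(elapsedSeconds * 60.0), 60)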

The CMTime tells the decoder when to display the image, not for how long to display it. You start with a frameCount value of 0, which tells the decoder to present the first image immediately. Try starting with 1 instead, to see whether the video then displays the first image a little later.
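A minimal sketch of that change to the question's code:

// start the presentation timestamps at 1 instead of 0
var frameCount: Int64 = 1
// ...inside the loop, as in the question:
let frameTime = CMTimeMake(frameCount, 1) // first frame appears at t = 1 s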

If you used startSessionAtSourceTime, then you must end the session with endSessionAtSourceTime before you call finishWritingWithCompletionHandler, or the closure may never be called. Pass the last timestamp to endSessionAtSourceTime.
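In the question's code that could look like this (a sketch, assuming frameCount, with timescale 1 as in the question, is one past the timestamp of the last appended frame):

avAssetInput.markAsFinished()
videoWriter.endSessionAtSourceTime(CMTimeMake(frameCount, 1))
videoWriter.finishWritingWithCompletionHandler({() -> Void in
    println("finished writing...")
    CompletionHandler(path: videoOutputPath)
})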

You could also try the deprecated method from Apple to see whether this is a bug. After marking the input as finished, call

videoWriter.finishWriting()

instead of finishWritingWithCompletionHandler, and wait a little while for the disk writer to close the file (e.g. by using a dispatch queue):

int64_t delayInSeconds = 1;
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, delayInSeconds * NSEC_PER_SEC);
dispatch_after(popTime, dispatch_get_main_queue(), ^(void){

     // call your completion handler after the file has been written
});

Here is the Swift version:

let delayInSeconds:Double = 0.5
let popTime = dispatch_time(DISPATCH_TIME_NOW, Int64(delayInSeconds * Double(NSEC_PER_SEC)))
dispatch_after(popTime, dispatch_get_main_queue(), {

   println("finished writing...")
   CompletionHandler(path: videoOutputPath)
})

Maybe your videoWriter instance doesn't exist anymore by the time the handler would fire. The block is called asynchronously, but you declared videoWriter locally in your function, so ARC may release the object before the completion handler can be called. Declare the writer (and its input) outside the function, e.g. as properties of your class, to fix this issue.
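A minimal sketch of that restructuring (BackgroundVideoCreator is a hypothetical name for the asker's class):

import AVFoundation

class BackgroundVideoCreator {
    // properties instead of locals, so ARC keeps the objects alive
    // until the asynchronous completion handler has run
    var videoWriter: AVAssetWriter?
    var avAssetInput: AVAssetWriterInput?

    func createBackgroundVideo(completionHandler: (path: String) -> Void) {
        // build the writer as in the question, but assign it to
        // self.videoWriter instead of a local constant
    }
}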

Hint:

Keep your CGColorSpace in memory (e.g. as a class var or a static var), because CGColorSpaceCreateDeviceRGB() takes a long time to initialize. Creating the color space only once, before you encode the video, will increase your app's execution speed dramatically!
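A sketch of that caching in Obj-C (sharedRGBColorSpace is a made-up helper name; pixelBufferFromCGImage2 would call it instead of creating and releasing the color space on every frame):

// create the device RGB color space once, on first use, and keep it
+ (CGColorSpaceRef)sharedRGBColorSpace {
    static CGColorSpaceRef rgbColorSpace = NULL;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    });
    return rgbColorSpace; // intentionally never released
}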
