 

Why is my QTKit based image encoding application so slow?

In a Cocoa application I'm currently coding, I'm getting snapshot images (NSImage objects) from a Quartz Composer renderer, and I would like to encode them into a QTMovie at 720×480, 25 fps, with the H.264 codec, using the addImage: method. Here is the corresponding piece of code:

qRenderer = [[QCRenderer alloc] initOffScreenWithSize:NSMakeSize(720,480) colorSpace:CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB) composition:[QCComposition compositionWithFile:qcPatchPath]]; // define an "offscreen" Quartz composition renderer with the right image size


imageAttrs = [NSDictionary dictionaryWithObjectsAndKeys: @"avc1", // use the H264 codec
              QTAddImageCodecType, nil];

qtMovie = [[QTMovie alloc] initToWritableFile: outputVideoFile error:NULL]; // initialize the output QT movie object

long fps = 25;
frameNum = 0;

NSTimeInterval renderingTime = 0;
NSTimeInterval frameInc = (1./fps);
NSTimeInterval myMovieDuration = 70;
NSImage * myImage;
while (renderingTime <= myMovieDuration){
    if(![qRenderer renderAtTime: renderingTime arguments:NULL])
        NSLog(@"Rendering failed at time %.3fs", renderingTime);
    myImage = [qRenderer snapshotImage];
    [qtMovie addImage:myImage forDuration: QTMakeTimeWithTimeInterval(frameInc) withAttributes:imageAttrs];
    [myImage release];
    frameNum ++;
    renderingTime = frameNum * frameInc;
}
[qtMovie updateMovieFile];
[qRenderer release];
[qtMovie release]; 

It works; however, my application is not able to do this in real time on my new MacBook Pro, while I know that QuickTime Broadcaster can encode images in real time in H.264, at an even higher quality than the one I use, on the same computer.

So why? What's the issue here? Is this a hardware-management issue (multi-core threading, GPU, ...), or am I missing something? I should preface this by saying that I'm new (2 weeks of practice) to the Apple development world: Objective-C, Cocoa, Xcode, the QuickTime and Quartz Composer frameworks, etc.

Thanks for any help.

asked May 17 '11 by mben

1 Answer

AVFoundation is a more efficient way to render a QuartzComposer animation to an H.264 video stream.


// Assumes the AVFoundation and Quartz frameworks are linked.
#import <AVFoundation/AVFoundation.h>
#import <Quartz/Quartz.h>
#import <CoreVideo/CoreVideo.h>

size_t width = 640;
size_t height = 480;

const char *outputFile = "/tmp/Arabesque.mp4";

// Load the built-in Arabesque screen saver composition and create an offscreen renderer at the target size.
QCComposition *composition = [QCComposition compositionWithFile:@"/System/Library/Screen Savers/Arabesque.qtz"];
QCRenderer *renderer = [[QCRenderer alloc] initOffScreenWithSize:NSMakeSize(width, height)
                                                      colorSpace:CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB) composition:composition];

// Remove any previous output, then set up an asset writer producing an MPEG-4 file
// with a single H.264 video input.
unlink(outputFile);
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:@(outputFile)] fileType:AVFileTypeMPEG4 error:NULL];

NSDictionary *videoSettings = @{ AVVideoCodecKey : AVVideoCodecH264, AVVideoWidthKey : @(width), AVVideoHeightKey : @(height) };
AVAssetWriterInput* writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];

[videoWriter addInput:writerInput];
[writerInput release];

// The adaptor lets us hand CVPixelBuffers to the writer input directly.
AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:NULL];

int framesPerSecond = 30;
int totalDuration = 30;
int totalFrameCount = framesPerSecond * totalDuration;

[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];

__block long frameNumber = 0;

dispatch_queue_t workQueue = dispatch_queue_create("com.example.work-queue", DISPATCH_QUEUE_SERIAL);

NSLog(@"Starting.");
// The writer input pulls frames from this block whenever it is ready for more data.
[writerInput requestMediaDataWhenReadyOnQueue:workQueue usingBlock:^{
    while ([writerInput isReadyForMoreMediaData]) {
        NSTimeInterval frameTime = (float)frameNumber / framesPerSecond;
        if (![renderer renderAtTime:frameTime arguments:NULL]) {
            NSLog(@"Rendering failed at time %.3fs", frameTime);
            break;
        }

        // Grab the frame directly as a CVPixelBuffer (no NSImage round trip) and append it.
        CVPixelBufferRef frame = (CVPixelBufferRef)[renderer createSnapshotImageOfType:@"CVPixelBuffer"];
        [pixelBufferAdaptor appendPixelBuffer:frame withPresentationTime:CMTimeMake(frameNumber, framesPerSecond)];
        CFRelease(frame);

        frameNumber++;
        // Once the last frame has been appended, finish the file and clean up.
        if (frameNumber >= totalFrameCount) {
            [writerInput markAsFinished];
            [videoWriter finishWriting];
            [videoWriter release];
            [renderer release];
            NSLog(@"Rendered %ld frames.", frameNumber);
            break;
        }

    }
}];

In my testing this is around twice as fast as your posted code that uses QTKit. The biggest improvement appears to come from the H.264 encoding being handed off to the GPU rather than being performed in software. From a quick glance at a profile, it appears that the remaining bottlenecks are the rendering of the composition itself, and reading the rendered data back from the GPU into a pixel buffer. Obviously the complexity of your composition will have some impact on this.

It may be possible to further optimize this by using QCRenderer's ability to provide snapshots as CVOpenGLBufferRefs, which may keep the frame's data on the GPU rather than reading it back to hand it off to the encoder. I didn't look too far into that, though.
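For what it's worth, here is a minimal, untested sketch of what requesting a GPU-backed snapshot might look like, assuming the same renderer as above. Note that AVAssetWriterInputPixelBufferAdaptor still expects a CVPixelBufferRef, so extra plumbing would be needed to actually avoid the read-back:

// Hypothetical variant (untested): ask QCRenderer for a CVOpenGLBuffer snapshot instead
// of a CVPixelBuffer, keeping the frame's data on the GPU.
CVOpenGLBufferRef glFrame = (CVOpenGLBufferRef)[renderer createSnapshotImageOfType:@"CVOpenGLBuffer"];
if (glFrame) {
    // ... hand the buffer to an encoding path that can consume GPU-backed frames ...
    CVOpenGLBufferRelease(glFrame);
}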

answered Sep 27 '22 by bdash