 

iOS CoreVideo Memory Leaks

Can somebody help me trace these CoreVideo memory leaks reported when running Instruments in Xcode?

Basically, the leak happens when I press the "Record Video" button in my custom motion JPEG player. I can't tell exactly which part of my code is leaking because the Leaks instrument doesn't point to any of my calls. BTW, I'm testing on an iPad device.

Here's what the Leaks instrument reports:

  • Responsible Library: CoreVideo
  • Responsible Frame: CVPixelBufferBacking::initWithPixelBufferDescription(...), CVObjectAlloc(...), CVBuffer::init()

Here's the code that handles each motion JPEG frame streamed by the server:

- (void)processServerData:(NSData *)data
{
    // Render the video in the UIImage control
    UIImage *image = [UIImage imageWithData:data];
    self.imageCtrl.image = image;

    // Check if we are recording
    if (myRecorder.isRecording) {

        // Create the initial sample. TODO: check if this is still needed
        if (counter == 0) {

            self.buffer = [Recorder pixelBufferFromCGImage:image.CGImage size:myRecorder.imageSize];
            CVPixelBufferPoolCreatePixelBuffer(NULL, myRecorder.adaptor.pixelBufferPool, &buffer);

            if (buffer)
            {
                CVBufferRelease(buffer);
            }
        }

        if (counter < myRecorder.maxFrames)
        {
            if ([myRecorder.writerInput isReadyForMoreMediaData])
            {
                CMTime frameTime = CMTimeMake(1, myRecorder.timeScale);
                CMTime lastTime = CMTimeMake(counter, myRecorder.timeScale);
                CMTime presentTime = CMTimeAdd(lastTime, frameTime);

                self.buffer = [Recorder pixelBufferFromCGImage:image.CGImage size:myRecorder.imageSize];

                [myRecorder.adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];

                if (buffer)
                {
                    CVBufferRelease(buffer);
                }

                counter++;

                if (counter == myRecorder.maxFrames)
                {
                    [myRecorder finishSession];

                    counter = 0;
                    myRecorder.isRecording = NO;
                }
            }
            else
            {
                NSLog(@"adaptor not ready counter=%d ", counter);
            }
        }
    }
}

Here's the pixelBufferFromCGImage function:

// Creates a CVPixelBuffer from a CGImage. The caller owns the returned buffer
// and is responsible for releasing it with CVBufferRelease().
+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height,
                                          kCVPixelFormatType_32ARGB, (CFDictionaryRef)options,
                                          &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    // Draw the CGImage into the pixel buffer's backing memory.
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                                 4 * size.width, rgbColorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);
    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
    CGContextDrawImage(context,
                       CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)),
                       image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}
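
For context, my understanding of the ownership contract is that the caller owns the buffer returned by this method and has to balance each call with one CVBufferRelease after the frame is appended, roughly like this (just a sketch; frame is a hypothetical local variable, the other names are from the code above):

// Expected pairing (sketch): one CVBufferRelease per buffer created,
// done only after the buffer has been appended to the adaptor.
CVPixelBufferRef frame = [Recorder pixelBufferFromCGImage:image.CGImage size:myRecorder.imageSize];
if (frame) {
    [myRecorder.adaptor appendPixelBuffer:frame withPresentationTime:presentTime];
    CVBufferRelease(frame);
}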

Appreciate any help! Thanks.

asked Nov 06 '22 by German

1 Answer

I refactored the processFrame method and I'm no longer getting the leaks. The two changes that mattered: I removed the initial-sample block, which created a buffer with pixelBufferFromCGImage and then immediately overwrote it via CVPixelBufferPoolCreatePixelBuffer (leaking the first allocation), and I now append and release the buffer only when it is non-NULL, so every pixel buffer returned by pixelBufferFromCGImage is balanced by exactly one CVBufferRelease.

-(void) processFrame:(UIImage *) image {

    if (myRecorder.frameCounter < myRecorder.maxFrames)
    {
        if([myRecorder.writerInput isReadyForMoreMediaData])
        {
            CMTime frameTime = CMTimeMake(1, myRecorder.timeScale);
            CMTime lastTime=CMTimeMake(myRecorder.frameCounter, myRecorder.timeScale); 
            CMTime presentTime=CMTimeAdd(lastTime, frameTime);

            buffer = [Recorder pixelBufferFromCGImage:image.CGImage size:myRecorder.imageSize];

            if(buffer)
            {
                [myRecorder.adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];

                myRecorder.frameCounter++;

                CVBufferRelease(buffer);

                if (myRecorder.frameCounter==myRecorder.maxFrames)
                {
                    [myRecorder finishSession];

                    myRecorder.frameCounter=0;
                    myRecorder.isRecording = NO;
                }
            }
            else
            {
                NSLog(@"Buffer is empty");
            }
        }
        else
        {
            NSLog(@"adaptor not ready frameCounter=%d ",myRecorder.frameCounter );
        }
    }

}
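
For completeness, if you want to avoid calling CVPixelBufferCreate for every frame, the same drawing code can pull buffers from the adaptor's pixel buffer pool instead. This is only a sketch under a couple of assumptions (the adaptor's pixelBufferPool is non-nil once the writer session has started, and its pixel format and size match what pixelBufferFromCGImage sets up); it is not part of the fix above:

// Hypothetical pool-based variant (assumed method name and signature).
// Reuses buffers from the adaptor's pool instead of allocating a new one per frame.
// The caller still releases the returned buffer with CVBufferRelease().
+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
                                      pool:(CVPixelBufferPoolRef)pool
                                      size:(CGSize)size
{
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pxbuffer);
    if (status != kCVReturnSuccess || pxbuffer == NULL) {
        return NULL; // pool not ready yet or temporarily exhausted
    }

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace, kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context,
                       CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)),
                       image);
    CGContextRelease(context);
    CGColorSpaceRelease(rgbColorSpace);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer;
}

Either way, the important invariant is the one in the refactored method above: only append when the buffer is non-NULL, and release each buffer exactly once after appendPixelBuffer:withPresentationTime: has been called.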
answered Nov 12 '22 by German