 

Why is my image not updating when I set it from the capture output delegate callback?

I am trying to do something very simple. I want to display the video layer in full screen and, once every second, update a UIImage with the CMSampleBufferRef captured at that time. However, I am running into two different problems. The first one is that changing:

[connection setVideoMaxFrameDuration:CMTimeMake(1, 1)];
[connection setVideoMinFrameDuration:CMTimeMake(1, 1)];

will also modify the video preview layer. I thought it would only change the rate at which AVFoundation sends information to the delegate, but it seems to affect the entire session (which, in hindsight, makes sense). So this makes my video preview update only once per second. I guess I could omit those lines and simply add a timer in the delegate so that every second it sends the CMSampleBufferRef to another method for processing, but I don't know if this is the right approach.

My second problem is that the UIImageView is NOT updating, or sometimes it updates once and then never changes again. I am using this method to update it:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    //NSData *jpeg = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer] ;
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    [imageView setImage:image];
    // Add your code here that uses the image.
    NSLog(@"update");
}

which I took from the Apple examples. The method is being called every second, which I verified from the "update" log message, but the image is not changing at all. Also, is the sampleBuffer automatically destroyed, or do I have to release it?

These are the other two important methods. viewDidLoad:

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    session = [[AVCaptureSession alloc] init];

    // Add inputs and outputs.
    if ([session canSetSessionPreset:AVCaptureSessionPreset640x480]) {
        session.sessionPreset = AVCaptureSessionPreset640x480;
    }
    else {
        // Handle the failure.
        NSLog(@"Cannot set session preset to 640x480");
    }

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];

    if (!input) {
        // Handle the error appropriately.
        NSLog(@"Could create input: %@", error);
    }

    if ([session canAddInput:input]) {
        [session addInput:input];
    }
    else {
        // Handle the failure.
        NSLog(@"Could not add input");
    }

    // DATA OUTPUT
    dataOutput = [[AVCaptureVideoDataOutput alloc] init];

    if ([session canAddOutput:dataOutput]) {
        [session addOutput:dataOutput];

        dataOutput.videoSettings = 
        [NSDictionary dictionaryWithObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                    forKey: (id)kCVPixelBufferPixelFormatTypeKey];
        //dataOutput.minFrameDuration = CMTimeMake(1, 15);
        //dataOutput.minFrameDuration = CMTimeMake(1, 1);
        AVCaptureConnection *connection = [dataOutput connectionWithMediaType:AVMediaTypeVideo];

        [connection setVideoMaxFrameDuration:CMTimeMake(1, 1)];
        [connection setVideoMinFrameDuration:CMTimeMake(1, 1)];

    }
    else {
        // Handle the failure.
        NSLog(@"Could not add output");
    }
    // DATA OUTPUT END

    dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
    [dataOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);


    captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];

    [captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];

    [captureVideoPreviewLayer setBounds:videoLayer.layer.bounds];
    [captureVideoPreviewLayer setPosition:videoLayer.layer.position];

    [videoLayer.layer addSublayer:captureVideoPreviewLayer];

    [session startRunning];
}

Converting the CMSampleBufferRef to a UIImage:

- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer 
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer); 

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, 
                                                 bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    // Free up the context and color space
    CGContextRelease(context); 
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return (image);
}

Thanks in advance for any help you can give me.

Asked Feb 21 '23 by Pochi

1 Answer

From the documentation for the captureOutput:didOutputSampleBuffer:fromConnection: method:

This method is called on the dispatch queue specified by the output’s sampleBufferCallbackQueue property.

This means that if you need to update the UI using the buffer received in this method, you need to do it on the main queue, like this:

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {

    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    dispatch_async(dispatch_get_main_queue(), ^{
        [imageView setImage:image];
    });
}
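
Note that imageFromSampleBuffer: is still called on the capture queue here; only the UIKit call ([imageView setImage:]) is dispatched to the main queue, since UIKit should only be used from the main thread.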

EDIT: About your first question: I'm not sure I understand the problem, but if you only want to update the image once every second, you can also keep a "lastImageUpdateTime" value to compare against in the "didOutputSampleBuffer" method; update the image only when enough time has passed, and ignore the sample buffer otherwise.
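
For example, here is a minimal sketch of that approach (lastImageUpdateTime is a hypothetical NSTimeInterval ivar, assumed to start at 0):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // lastImageUpdateTime is a hypothetical NSTimeInterval ivar, initially 0.
    NSTimeInterval now = [NSDate timeIntervalSinceReferenceDate];

    // Ignore this sample buffer if less than one second has passed
    // since the last image update.
    if (now - lastImageUpdateTime < 1.0) {
        return;
    }
    lastImageUpdateTime = now;

    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    dispatch_async(dispatch_get_main_queue(), ^{
        [imageView setImage:image];
    });
}

This way you can leave the connection's frame durations alone, so the preview layer keeps running at the full frame rate while the UIImageView is only refreshed once per second.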

Answered Apr 07 '23 by adig