Cropping a captured image exactly to how it looks in AVCaptureVideoPreviewLayer

I have a photo app that is using AV Foundation. I have set up a preview layer using AVCaptureVideoPreviewLayer that takes up the top half of the screen, so when the user is trying to take their photo, all they can see is what the top half of the screen sees.

This works great, but when the user actually takes the photo and I try to set the photo as the layer's contents, the image is distorted. I did research and realized that I would need to crop the image.

All I want to do is crop the full captured image so that all that is left is exactly what the user could originally see in the top half of the screen.

I have been able to sort of accomplish this, but only by entering manual CGRect values, and it still does not look perfect. There has to be an easier way to do this.

I have spent the past two days going through every post on Stack Overflow about cropping images, and nothing has worked.

There has to be a way to programmatically crop the captured image so that the final image will be exactly what was originally seen in the preview layer.

Here is my viewDidLoad implementation:

- (void)viewDidLoad
{
    [super viewDidLoad];

    AVCaptureSession *session =[[AVCaptureSession alloc]init];
    [session setSessionPreset:AVCaptureSessionPresetPhoto];

    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];

    if([session canAddInput:deviceInput])
        [session addInput:deviceInput];

    CALayer *rootLayer = [[self view]layer];
    [rootLayer setMasksToBounds:YES];

    _previewLayer = [[AVCaptureVideoPreviewLayer alloc]initWithSession:session];
    [_previewLayer setFrame:CGRectMake(0, 0, rootLayer.bounds.size.width, rootLayer.bounds.size.height/2)];
    [_previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

    [rootLayer insertSublayer:_previewLayer atIndex:0];

    _stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    [session addOutput:_stillImageOutput];

    [session startRunning];
}

And here is the code that runs when the user presses the button to capture a photo:

-(IBAction)stillImageCapture {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in _stillImageOutput.connections){
        for (AVCaptureInputPort *port in [connection inputPorts]){
            if ([[port mediaType] isEqual:AVMediaTypeVideo]){
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", _stillImageOutput);

    [_stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if(imageDataSampleBuffer) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];

            UIImage *image = [[UIImage alloc]initWithData:imageData];
            CALayer *subLayer = [CALayer layer];
            subLayer.frame = _previewLayer.frame;
            image = [self rotate:image andOrientation:image.imageOrientation];

            // Below is the crop that sort of works for me, but as you can see I am entering values manually, just guessing, and it still does not look perfect.
            CGRect cropRect = CGRectMake(0, 650, 3000, 2000);
            CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], cropRect);

            subLayer.contents = (id)[UIImage imageWithCGImage:imageRef].CGImage;
            subLayer.frame = _previewLayer.frame;

            [_previewLayer addSublayer:subLayer];
        }
    }];
}
asked Feb 21 '14 by user3117509

2 Answers

Have a look at AVCaptureVideoPreviewLayer's

-(CGRect)metadataOutputRectOfInterestForRect:(CGRect)layerRect

This method lets you easily convert the visible CGRect of your layer into the coordinate space of the actual camera output: it returns a fractional rect, with components between 0.0 and 1.0, relative to the captured image.

One caveat: the physical camera is not mounted "top side up", but rather rotated 90 degrees clockwise. (So if you hold your iPhone with the Home Button on the right, the camera is actually top side up.)

Keeping this in mind, you have to convert the CGRect the above method gives you in order to crop the image to exactly what is on screen.

Example:

CGRect visibleLayerFrame = self.previewView.bounds; // the area of the preview layer that is actually visible on screen
CGRect metaRect = [(AVCaptureVideoPreviewLayer *)self.previewView.layer metadataOutputRectOfInterestForRect:visibleLayerFrame]; // previewView is assumed to be backed by an AVCaptureVideoPreviewLayer


CGSize originalSize = [originalImage size]; // originalImage is the full-resolution captured UIImage

if (UIInterfaceOrientationIsPortrait(_snapInterfaceOrientation)) { // the interface orientation when the photo was taken
    // For portrait images, swap the size of the image, because
    // here the output image is actually rotated relative to what you see on screen.

    CGFloat temp = originalSize.width;
    originalSize.width = originalSize.height;
    originalSize.height = temp;
}


// metaRect is fractional, that's why we multiply here

CGRect cropRect;

cropRect.origin.x = metaRect.origin.x * originalSize.width;
cropRect.origin.y = metaRect.origin.y * originalSize.height;
cropRect.size.width = metaRect.size.width * originalSize.width;
cropRect.size.height = metaRect.size.height * originalSize.height;

cropRect = CGRectIntegral(cropRect);
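
The snippet above stops once cropRect is known. A minimal sketch of the final crop step (not part of the original answer), assuming originalImage is the full captured UIImage and that you want to carry its orientation over so the result still displays upright:

// Crop to the region that was visible in the preview layer.
// originalImage and cropRect come from the snippet above.
CGImageRef croppedCGImage = CGImageCreateWithImageInRect(originalImage.CGImage, cropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:croppedCGImage
                                            scale:originalImage.scale
                                      orientation:originalImage.imageOrientation];
CGImageRelease(croppedCGImage);

croppedImage can then be set as the sublayer's contents in the question's completion handler, replacing the hard-coded crop rect.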

This may be a bit confusing, but what made me really understand it is this:

Hold your device "Home Button right" and you'll see that the x-axis actually lies along the "height" of your iPhone, while the y-axis lies along its "width". That's why you have to swap the size for portrait images ;)

answered Oct 27 '22 by CodingMeSwiftly

@Cabus has a solution that works and you should up-vote his answer. However, I did my own version in Swift with the following:

// The image returned in initialImageData will be larger than what
//  is shown in the AVCaptureVideoPreviewLayer, so we need to crop it.
let image : UIImage = UIImage(data: initialImageData)!

let originalSize : CGSize
let visibleLayerFrame = self.previewView!.bounds // THE ACTUAL VISIBLE AREA IN THE LAYER FRAME

// Calculate the fractional size that is shown in the preview
let metaRect : CGRect = (self.videoPreviewLayer?.metadataOutputRectOfInterestForRect(visibleLayerFrame))!
if (image.imageOrientation == UIImageOrientation.Left || image.imageOrientation == UIImageOrientation.Right) {
    // For these images (which are portrait), swap the size of the
    // image, because here the output image is actually rotated
    // relative to what you see on screen.
    originalSize = CGSize(width: image.size.height, height: image.size.width)
}
else {
    originalSize = image.size
}

// metaRect is fractional, that's why we multiply here.
let cropRect : CGRect = CGRectIntegral(
        CGRect( x: metaRect.origin.x * originalSize.width,
                y: metaRect.origin.y * originalSize.height,
                width: metaRect.size.width * originalSize.width,
                height: metaRect.size.height * originalSize.height))

let finalImage : UIImage = 
    UIImage(CGImage: CGImageCreateWithImageInRect(image.CGImage, cropRect)!, 
        scale:1, 
        orientation: image.imageOrientation )
answered Oct 27 '22 by Erik Allen