IMPORTANT: if I use session.sessionPreset = AVCaptureSessionPresetHigh; my preview image is not stretched! If I save the photo to the device with UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil); the image is normal; it is only stretched in the preview.
I'm using AVFoundation to capture the photo.
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;
CALayer *viewLayer = vImagePreview.layer;
NSLog(@"viewLayer = %@", viewLayer);
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = vImagePreview.bounds;
[vImagePreview.layer addSublayer:captureVideoPreviewLayer];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    // Handle the error appropriately.
    NSLog(@"ERROR: trying to open camera: %@", error);
}
[session addInput:input];
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
I set the sessionPreset to AVCaptureSessionPresetPhoto:
session.sessionPreset = AVCaptureSessionPresetPhoto;
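(Side note, not in my original code: it may be worth checking that the device actually supports the Photo preset before assigning it, for example:)
if ([session canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
    session.sessionPreset = AVCaptureSessionPresetPhoto;
} else {
    // Fall back to High if the Photo preset isn't available on this device.
    session.sessionPreset = AVCaptureSessionPresetHigh;
}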
My capture method:
-(void)captureNow {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection)
        {
            break;
        }
    }
    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments)
        {
            // Do something with the attachments.
            NSLog(@"attachments: %@", exifAttachments);
        } else {
            NSLog(@"no attachments");
        }
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        NSLog(@"%@", NSStringFromCGSize(image.size));
        [self animateUpTheImageWithImage:image];
    }];
}
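(A quick diagnostic, not in my original code, that can help confirm whether this is an aspect-ratio problem: a hypothetical helper that logs the captured image's aspect ratio against the view that will display it.)
- (void)logAspectMismatchForImage:(UIImage *)image inView:(UIView *)displayView {
    // Hypothetical helper: a large difference between these two ratios means
    // the image will be distorted unless an aspect-preserving contentMode
    // (or videoGravity, for a preview layer) is used.
    CGFloat imageAspect = image.size.width / image.size.height;
    CGFloat viewAspect  = displayView.bounds.size.width / displayView.bounds.size.height;
    NSLog(@"image aspect: %.3f, view aspect: %.3f", imageAspect, viewAspect);
}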
Where I add the captured photo's preview:
- (void)animateUpTheImageWithImage:(UIImage *)theImage {
    UIView *preview = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.frame.size.width, self.frame.size.height/*426*/)];
    CALayer *previewLayer = preview.layer;
    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = previewLayer.frame;
    [previewLayer addSublayer:captureVideoPreviewLayer];
    captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [self addSubview:preview];
}
And the result is that my captured image is stretched!
So I resolved my problem. Here is the code I'm using now, and it's working fine:
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;
captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
captureVideoPreviewLayer.frame = vImagePreview.bounds;
[vImagePreview.layer addSublayer:captureVideoPreviewLayer];
device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
...
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
capturedImageView = [[UIView alloc]initWithFrame:CGRectMake(0, 0, self.screenWidth,self.screenHeight)];
[self addSubview:capturedImageView];
And, importantly, the output image view:
vImage = [[UIImageView alloc]initWithFrame:CGRectMake(0, 0, self.screenWidth,self.screenHeight)];
[capturedImageView addSubview:vImage];
vImage.autoresizingMask = (UIViewAutoresizingFlexibleBottomMargin|UIViewAutoresizingFlexibleHeight|UIViewAutoresizingFlexibleLeftMargin|UIViewAutoresizingFlexibleRightMargin|UIViewAutoresizingFlexibleTopMargin|UIViewAutoresizingFlexibleWidth);
vImage.contentMode = UIViewContentModeScaleAspectFill;
vImage.image = theImage;
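One thing these snippets don't show: the session still has to be started at some point, otherwise the preview layer stays blank:
// Start the session so the preview layer begins receiving frames.
[session startRunning];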
Some additional info:
The camera layer must be full screen (you can modify its y coordinate, but its width and height must be full size), and the output image view must be full screen too, as in the sketch below.
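For example (hypothetical offset, using the same self.screenWidth / self.screenHeight values as above), moving the camera layer down while keeping it full size would look something like:
// Shift the preview down by 60 pt but keep the full-size width and height,
// so the layer's aspect ratio doesn't change and nothing gets stretched.
captureVideoPreviewLayer.frame = CGRectMake(0, 60, self.screenWidth, self.screenHeight);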
I hope this information will be useful for somebody else too.