I am trying to create a camera app that would act more or less like the default Camera app. The part that is not working for me at the moment is tap to focus. I want the camera to focus and do whatever it does at the point I touch, just like the stock camera app does.
Here's my viewDidLoad
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Session
    _session = [[AVCaptureSession alloc] init];
    _session.sessionPreset = AVCaptureSessionPresetPhoto;

    // Input
    _videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    _videoInput = [AVCaptureDeviceInput deviceInputWithDevice:_videoDevice error:nil];

    // Output
    _frameOutput = [[AVCaptureVideoDataOutput alloc] init];
    _frameOutput.videoSettings = [NSDictionary dictionaryWithObject:AVVideoCodecJPEG forKey:AVVideoCodecKey];
    [_frameOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

    [_session addInput:_videoInput];
    [_session addOutput:_frameOutput];
    [_session startRunning];
}
And here's the method that should make the camera focus on whatever point I tap.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [touches enumerateObjectsUsingBlock:^(id obj, BOOL *stop) {
        UITouch *touch = obj;
        CGPoint touchPoint = [touch locationInView:touch.view];
        focusLayer.frame = CGRectMake((touchPoint.x - 25), (touchPoint.y - 25), 50, 50);

        if ([_videoDevice isFocusPointOfInterestSupported]) {
            NSError *error;
            if ([_videoDevice lockForConfiguration:&error]) {
                [_videoDevice setFocusPointOfInterest:touchPoint];
                [_videoDevice setExposurePointOfInterest:touchPoint];
                [_videoDevice setFocusMode:AVCaptureFocusModeAutoFocus];
                if ([_videoDevice isExposureModeSupported:AVCaptureExposureModeAutoExpose]) {
                    [_videoDevice setExposureMode:AVCaptureExposureModeAutoExpose];
                }
                [_videoDevice unlockForConfiguration];
            }
        }
        // NSLog(@"x = %f, y = %f", touchPoint.x, touchPoint.y);
    }];
}
Nothing really happens when I tap the screen.
You have to adjust the touchPoint to a range of [0,1] using something like the following code:
CGRect screenRect = [[UIScreen mainScreen] bounds];
CGFloat screenWidth = screenRect.size.width;
CGFloat screenHeight = screenRect.size.height;

// thisFocusPoint here is a small focus-indicator view centered on the tap
double focus_x = thisFocusPoint.center.x / screenWidth;
double focus_y = thisFocusPoint.center.y / screenHeight;

NSError *error = nil;
if ([[self captureManager].videoDevice lockForConfiguration:&error]) {
    [[self captureManager].videoDevice setFocusPointOfInterest:CGPointMake(focus_x, focus_y)];
    [[self captureManager].videoDevice unlockForConfiguration];
}
The documentation on this can be found in Apple's AV Foundation Programming Guide, in the Media Capture section, under Focus Modes:
If it’s supported, you set the focal point using focusPointOfInterest. You pass a CGPoint where {0,0} represents the top left of the picture area, and {1,1} represents the bottom right in landscape mode with the home button on the right—this applies even if the device is in portrait mode.
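Applied to the touchesBegan: handler from the question, the normalization might look like the sketch below. This is only a sketch assuming the preview fills the entire view; because of the landscape-based coordinate system quoted above, a portrait preview may also need its axes swapped, or you can let AVCaptureVideoPreviewLayer's captureDevicePointOfInterestForPoint: do the conversion for you (a later answer below uses that approach).
// Sketch only: normalize the tap before handing it to the capture device.
// Assumes the preview fills the entire view; depending on orientation the
// axes may also need to be swapped to match the landscape-based coordinate
// system described above.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:touch.view];
    CGSize viewSize = touch.view.bounds.size;

    CGPoint normalizedPoint = CGPointMake(touchPoint.x / viewSize.width,
                                          touchPoint.y / viewSize.height);

    if ([_videoDevice isFocusPointOfInterestSupported] &&
        [_videoDevice isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error = nil;
        if ([_videoDevice lockForConfiguration:&error]) {
            [_videoDevice setFocusPointOfInterest:normalizedPoint];
            [_videoDevice setFocusMode:AVCaptureFocusModeAutoFocus];
            [_videoDevice unlockForConfiguration];
        }
    }
}
The same conversion can also be driven from a UITapGestureRecognizer instead of the touch handler: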
UITapGestureRecognizer *shortTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTapToFocus:)];
shortTap.numberOfTapsRequired = 1;
shortTap.numberOfTouchesRequired = 1;
[viewCanvasRecording addGestureRecognizer:shortTap];
and then this:
- (void)handleTapToFocus:(UITapGestureRecognizer *)tapGesture
{
    AVCaptureDevice *acd = !currentFrontCamera ? captureBackInput.device : captureFrontInput.device;
    if (tapGesture.state == UIGestureRecognizerStateEnded)
    {
        CGPoint thisFocusPoint = [tapGesture locationInView:viewCanvasRecording];

        double focus_x = thisFocusPoint.x / viewCanvasRecording.frame.size.width;
        double focus_y = thisFocusPoint.y / viewCanvasRecording.frame.size.height;

        if ([acd isFocusModeSupported:AVCaptureFocusModeAutoFocus] && [acd isFocusPointOfInterestSupported])
        {
            if ([acd lockForConfiguration:nil])
            {
                // Set the point of interest first, then the mode, so that the
                // mode change triggers a focus operation at the new point.
                [acd setFocusPointOfInterest:CGPointMake(focus_x, focus_y)];
                [acd setFocusMode:AVCaptureFocusModeAutoFocus];

                /*
                if ([acd isExposureModeSupported:AVCaptureExposureModeAutoExpose] && [acd isExposurePointOfInterestSupported])
                {
                    [acd setExposurePointOfInterest:CGPointMake(focus_x, focus_y)];
                    [acd setExposureMode:AVCaptureExposureModeAutoExpose];
                }*/
                [acd unlockForConfiguration];
            }
        }
    }
}
A Swift version:
@IBAction func tapToFocus(_ sender: UITapGestureRecognizer) {
    if sender.state == .ended {
        let thisFocusPoint = sender.location(in: previewView)
        print("touch to focus ", thisFocusPoint)

        let focus_x = thisFocusPoint.x / previewView.frame.size.width
        let focus_y = thisFocusPoint.y / previewView.frame.size.height

        guard let captureDevice = captureDevice else { return }

        if captureDevice.isFocusModeSupported(.autoFocus) && captureDevice.isFocusPointOfInterestSupported {
            do {
                try captureDevice.lockForConfiguration()
                // Set the point of interest first, then the mode, so that the
                // mode change triggers a focus operation at the new point.
                captureDevice.focusPointOfInterest = CGPoint(x: focus_x, y: focus_y)
                captureDevice.focusMode = .autoFocus

                if captureDevice.isExposureModeSupported(.autoExpose) && captureDevice.isExposurePointOfInterestSupported {
                    captureDevice.exposurePointOfInterest = CGPoint(x: focus_x, y: focus_y)
                    captureDevice.exposureMode = .autoExpose
                }
                captureDevice.unlockForConfiguration()
            } catch {
                print(error)
            }
        }
    }
}
Here is how I handle gestures for my AV camera preview. Set up your UITapGestureRecognizer first, then convert the touch point with captureDevicePointOfInterestForPoint:.
- (void)setupGestures
{
    UITapGestureRecognizer *tapToFocusRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                                           action:@selector(handleTapToFocusAndExposureRecognizer:)];
    [self addGestureRecognizer:tapToFocusRecognizer];
}

- (void)handleTapToFocusAndExposureRecognizer:(UITapGestureRecognizer *)tapRecognizer {
    CGPoint touchPoint = [tapRecognizer locationInView:self];
    // Convert from view coordinates to the device's [0,1] coordinate space
    CGPoint point = [self.previewLayer captureDevicePointOfInterestForPoint:touchPoint];
    AVCaptureDevice *device = [self.videoCaptureDeviceInput device];
    NSError *error = nil;
    if (tapRecognizer.state == UIGestureRecognizerStateEnded) {
        if (![device lockForConfiguration:&error]) {
            if (error) {
                RCTLogError(@"%s: %@", __func__, error);
            }
            return;
        }
        [device setFocusPointOfInterest:point];
        [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [device setExposurePointOfInterest:point];
        [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
        [device unlockForConfiguration];
    }
}
I'm using AVCaptureVideoPreviewLayer to convert the touch point. If you render the preview with a GLKView instead of AVCaptureVideoPreviewLayer, you can't convert the point this way and have to normalize it yourself, as the previous answers do.
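For reference, a minimal sketch of how the preview layer used above might be created; the session variable and the hosting view are assumptions, not part of the answer's code:
// Sketch only: create and install the AVCaptureVideoPreviewLayer whose
// -captureDevicePointOfInterestForPoint: is used above. `session` and the
// hosting view are assumed names.
self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
self.previewLayer.frame = self.bounds;
[self.layer addSublayer:self.previewLayer];
Because the preview layer knows its own videoGravity and orientation, captureDevicePointOfInterestForPoint: accounts for cropping and rotation that a plain width/height division does not.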
I'm new to iOS development; hope this helps.