
How to calculate FOV?

Initial Context

I am developing a location-based augmented reality application and I need to get the field of view (FOV). (I only update the value when the orientation changes, so I am looking for a method I can call to get this value on demand.)

The goal is to make a "degree ruler" that matches reality, like the following: Degree Ruler - AR App

I am already using AVCaptureSession to display the camera stream, and a path coupled with a CAShapeLayer to draw the ruler. This works pretty well, but now I have to use the field of view value to place my elements in the right place (for example, choosing the right spacing between 160° and 170°!).
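To illustrate what I mean by placing elements, this is roughly the projection I have in mind (a sketch with hypothetical names; it assumes a rectilinear lens and that heading, bearing, and FOV are all in degrees):

```swift
import Foundation

// Hypothetical helper: map a compass bearing to a horizontal screen position,
// given the device heading and the camera's horizontal FOV (all in degrees).
// A pinhole (tangent) projection rather than a linear spacing is what makes
// an on-screen ruler line up with real objects.
func screenX(bearing: Double, heading: Double, hFOV: Double, screenWidth: Double) -> Double {
    let delta = (bearing - heading) * Double.pi / 180.0
    let halfFOV = (hFOV / 2.0) * Double.pi / 180.0
    // Offset from the screen centre, as a fraction of half the width.
    let offset = tan(delta) / tan(halfFOV)
    return screenWidth / 2.0 + offset * (screenWidth / 2.0)
}
```

With this, a bearing equal to the heading lands at the screen centre, and a bearing at heading + hFOV/2 lands at the right edge.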

Actually, I am hardcoding these values from this source: https://stackoverflow.com/a/3594424/3198096 (special thanks to @hotpaw2!). But I am not sure they are fully precise, and this does not handle the iPhone 5, etc. I was unable to obtain values from official sources (Apple!), but there is a link showing values for all the iDevices I think I need (4, 4S, 5, 5S): AnandTech | Some thoughts about the iPhone 5S camera improvements.

Note: After personal tests and some other research online, I am pretty sure these values are inaccurate! This also forces me to use an external library to check which iPhone model I am running on in order to manually initialize my FOV... And I have to verify my values for every supported device.

I would prefer a "code solution"!

After reading this post: iPhone: Real-time video color info, focal length, aperture?, I am trying to get EXIF data from AVCaptureStillImageOutput as suggested. After that I should be able to read the focal length from the EXIF data, and then calculate the horizontal and vertical field of view via a formula! (Or maybe directly obtain the FOV as shown here: http://www.brianklug.org/2011/11/a-quick-analysis-of-exif-data-from-apples-iphone-4s-camera-samples/ -- note: after a certain number of updates, it seems that we can't get the field of view directly from EXIF!)


Actual Point

Sources: http://iphonedevsdk.com/forum/iphone-sdk-development/112225-camera-app-working-well-on-3gs-but-not-on-4s.html and Modified EXIF data doesn't save properly

Here is the code I am using:

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (camera != nil)
{
    captureSession = [[AVCaptureSession alloc] init];

    AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:nil];
    [captureSession addInput:newVideoInput];

    captureLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
    captureLayer.frame = overlayCamera.bounds;
    [captureLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    previewLayerConnection = captureLayer.connection;
    [self setCameraOrientation:[[UIApplication sharedApplication] statusBarOrientation]];
    [overlayCamera.layer addSublayer:captureLayer];
    [captureSession startRunning];

    AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [captureSession addOutput:stillImageOutput];

    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                  completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        NSData *imageNSData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];

        // __bridge (not __bridge_retained), so the NSData is not over-retained
        CGImageSourceRef imgSource = CGImageSourceCreateWithData((__bridge CFDataRef)imageNSData, NULL);

        // __bridge_transfer: CGImageSourceCopyPropertiesAtIndex returns a +1 reference
        NSDictionary *metadata = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(imgSource, 0, NULL);
        NSMutableDictionary *metadataAsMutable = [metadata mutableCopy];

        NSMutableDictionary *EXIFDictionary = [[metadataAsMutable objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
        if (!EXIFDictionary)
            EXIFDictionary = [NSMutableDictionary dictionary];
        [metadataAsMutable setObject:EXIFDictionary forKey:(NSString *)kCGImagePropertyExifDictionary];

        NSLog(@"%@", EXIFDictionary);

        CFRelease(imgSource);
    }];
}

Here is the output:

{
    ApertureValue = "2.52606882168926";
    BrightnessValue = "0.5019629837352776";
    ColorSpace = 1;
    ComponentsConfiguration = (1, 2, 3, 0);
    ExifVersion = (2, 2, 1);
    ExposureMode = 0;
    ExposureProgram = 2;
    ExposureTime = "0.008333333333333333";
    FNumber = "2.4";
    Flash = 16;
    FlashPixVersion = (1, 0);
    FocalLenIn35mmFilm = 40;
    FocalLength = "4.28";
    ISOSpeedRatings = (50);
    LensMake = Apple;
    LensModel = "iPhone 4S back camera 4.28mm f/2.4";
    LensSpecification = ("4.28", "4.28", "2.4", "2.4");
    MeteringMode = 5;
    PixelXDimension = 1920;
    PixelYDimension = 1080;
    SceneCaptureType = 0;
    SceneType = 1;
    SensingMethod = 2;
    ShutterSpeedValue = "6.906947890818858";
    SubjectDistance = "69.999";
    UserComment = "[S.D.] kCGImagePropertyExifUserComment";
    WhiteBalance = 0;
}

I think I have everything I need to calculate the FOV. But are these the right values? After reading a lot of different websites giving different focal length values, I am a bit confused! Also, my PixelXDimension/PixelYDimension seem to be wrong!

Via http://en.wikipedia.org/wiki/Angle_of_view, this is the formula I plan to use:

// d = sensor dimension (mm), f = focal length (mm)
FOV = IN_DEGREES(2 * atan(d / (2 * f)));
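As a sketch of that formula in Swift: using the 35mm-equivalent focal length (FocalLenIn35mmFilm = 40 from the EXIF dump above) together with the full-frame 36mm x 24mm dimensions avoids needing the physical sensor size, but this is an assumption on my part, and the video preset's crop may make the result differ from what the preview actually shows.

```swift
import Foundation

// Angle of view from a sensor dimension d (mm) and focal length f (mm):
// FOV = 2 * atan(d / (2 * f)), converted to degrees.
func fieldOfView(sensorDimension d: Double, focalLength f: Double) -> Double {
    return 2.0 * atan(d / (2.0 * f)) * 180.0 / Double.pi
}

// Assumption: plug the 35mm-equivalent focal length into the full-frame
// 36mm x 24mm dimensions, so the physical sensor size is not needed.
let hFOV = fieldOfView(sensorDimension: 36.0, focalLength: 40.0)  // ~48.5 degrees
let vFOV = fieldOfView(sensorDimension: 24.0, focalLength: 40.0)  // ~33.4 degrees
```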

My Question

Do my method and my formula look right, and if yes, which values do I pass to the function?


Clarifications

  • FOV is what I think I need to use, but if you have any other suggestion for how the ruler can match reality, I would accept that answer!
  • Zoom is disabled in the augmented reality view controller, so my field of view is fixed once the camera is initialized, and can't change until the user rotates the phone!

asked Mar 03 '14 by Humbertda



2 Answers

In iOS 7 and above you can do something along these lines:

float FOV = camera.activeFormat.videoFieldOfView; 

where camera is your AVCaptureDevice. Depending on what preset you choose for the video session, this can change even on the same device. It's the horizontal field-of-view (in degrees), so you'll need to calculate the vertical field-of-view from the display dimensions.

Here's Apple's reference material.
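If you need the vertical value as well, one option is to scale by tangents rather than linearly, since angle does not vary proportionally with pixels. A sketch, assuming a rectilinear (non-fisheye) lens and a known frame aspect ratio:

```swift
import Foundation

// Derive the vertical FOV from the horizontal FOV reported by
// activeFormat.videoFieldOfView, using the frame's aspect ratio.
// The tangent relation holds for a rectilinear lens; a plain linear
// scale (hFOV * 9 / 16) is only an approximation.
func verticalFOV(horizontalFOV: Double, width: Double, height: Double) -> Double {
    let hRad = horizontalFOV * Double.pi / 180.0
    let vRad = 2.0 * atan(tan(hRad / 2.0) * (height / width))
    return vRad * 180.0 / Double.pi
}

let vFOV = verticalFOV(horizontalFOV: 58.0, width: 16.0, height: 9.0)  // ~34.6 degrees
```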

answered Sep 24 '22 by Wildaker


To answer your question:

Do my method and my formula look right...?

Maybe, but they also look too complex.

...and if yes, which values do I pass to the function?

I don't know, but if the goal is to calculate the HFOV and VFOV, here is a code example. It programmatically finds the horizontal viewing angle (currently the only viewing angle accessible in Swift), then calculates the vertical viewing angle from the 16:9 aspect ratio of the iPhone 6.

let devices = AVCaptureDevice.devices()
var captureDevice: AVCaptureDevice?
for device in devices {
    if device.hasMediaType(AVMediaTypeVideo) {
        if device.position == AVCaptureDevicePosition.Back {
            captureDevice = device as? AVCaptureDevice
        }
    }
}
if let retrievedDevice = captureDevice {
    let HFOV: Float = retrievedDevice.activeFormat.videoFieldOfView
    let VFOV: Float = (HFOV / 16.0) * 9.0
}

Also remember to import AVFoundation if you want this to work!

answered Sep 22 '22 by James Mart