I would like to draw an artificial horizon. The center of the view would represent a perfectly horizontal view, with roll rotating the horizon line and pitch moving it up or down.
The question is: what is the correct calculation for translating the horizon line up or down, given the pitch angle?
My guess is that this depends on the FOV angle assumed for the camera, so this angle would need to be a factor in the formula. Ideally I would determine this angle for the iPhone/iPad camera, so that the artificial horizon lines up with the actual horizon if you hold the device in front of you and look towards the horizon.
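(For the FOV part, here is a minimal sketch of how the camera's field of view could be read on iOS with AVFoundation; the choice of the back wide-angle camera, the 4:3 sensor aspect ratio, and the conversion to a vertical FOV are my assumptions:)

```swift
import AVFoundation

// Sketch: read the camera's horizontal FOV (reported in degrees) and derive the vertical FOV.
// Assumes the back wide-angle camera and a 4:3 sensor aspect ratio; adjust for your format.
func cameraVerticalFOV(aspectRatio: Double = 4.0 / 3.0) -> Double? {
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back) else { return nil }
    let hFOV = Double(device.activeFormat.videoFieldOfView) * .pi / 180   // radians
    // Rectilinear projection: tan(vFOV / 2) = tan(hFOV / 2) / (width / height)
    return 2 * atan(tan(hFOV / 2) / aspectRatio)                          // radians
}
```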
Until now I've been guesstimating the offset, but I would like to have the exact formula.
Try horizon_offset / (screen_height / 2) = tan(pitch) / tan(vertical_FOV / 2).
The formula derives itself from the picture (derivation diagram, source: zwibbler.com): the image plane sits at some distance d from the viewpoint, so half the screen height corresponds to d * tan(vertical_FOV / 2), while the horizon line is offset from the center by d * tan(pitch); taking the ratio gives the relation above.
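In code, a minimal sketch of that formula (Swift, names are mine; pitch and FOV in radians, offset in the same units as the screen height):

```swift
import Foundation

/// Horizon offset from the screen center, in the same units as screenHeight.
/// pitch and verticalFOV are in radians. From:
///   horizon_offset / (screen_height / 2) = tan(pitch) / tan(vertical_FOV / 2)
func horizonOffset(pitch: Double, verticalFOV: Double, screenHeight: Double) -> Double {
    return (screenHeight / 2) * tan(pitch) / tan(verticalFOV / 2)
}

// Example: 5° of pitch, 48° vertical FOV, 736-point-tall screen -> about 72 points.
let offset = horizonOffset(pitch: 5 * .pi / 180,
                           verticalFOV: 48 * .pi / 180,
                           screenHeight: 736)
```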
Update: I had two angles mixed up. One is the FOV angle of the camera, the other is the viewing angle of the screen. These are two different things; the latter depends on the viewing distance. You probably have to estimate this distance, and adjust the magnification and/or focal distance so that objects visible on the screen have the same angular size as the same objects seen with the naked eye. (With my particular phone, you would need to magnify the image by an additional factor of about 3 after the 5x zoom, if the user holds the phone at arm's length.) Then the two angles are the same, and the formula works.
If you want to introduce magnification (i.e. objects on the screen have different sizes from their real-life counterparts), multiply the horizon offset by the magnification factor.
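A sketch of that adjustment, using my own formulation of the matching condition described above (a rectilinear camera image is assumed, and you have to estimate the screen height and viewing distance yourself):

```swift
import Foundation

/// Magnification that makes objects on screen subtend the same angle at the eye as they do
/// when seen directly (my formulation; assumes a rectilinear camera image).
/// cameraVerticalFOV in radians; screenHeight and viewingDistance in the same physical unit.
func matchingMagnification(cameraVerticalFOV: Double,
                           screenHeight: Double,
                           viewingDistance: Double) -> Double {
    let screenHalfAngle = atan((screenHeight / 2) / viewingDistance)
    return tan(cameraVerticalFOV / 2) / tan(screenHalfAngle)
}

// As stated above, any extra magnification also scales the horizon offset:
// scaledOffset = magnification * horizonOffset(pitch: ..., verticalFOV: ..., screenHeight: ...)
```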
Update 2: When the viewing distance is taken into account, the screen size cancels out, and the offset simply becomes viewing_distance * tan(pitch_angle) (with unit magnification).
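A sketch of that version (names are mine; the result is a physical distance in the same unit as the viewing distance, so convert it to points or pixels with the display's density before drawing):

```swift
import Foundation

/// Horizon offset when the screen's own viewing geometry is used: the screen size cancels out.
/// pitch in radians, viewingDistance in any physical unit; the result is in that same unit.
func horizonOffsetPhysical(pitch: Double,
                           viewingDistance: Double,
                           magnification: Double = 1) -> Double {
    return magnification * viewingDistance * tan(pitch)
}
```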