I am trying to calculate the real-world distance of an arbitrary line drawn across the field of view in a single-camera, one-point-perspective setup.
I will have a known distance running parallel to it. How can I find the compensation factor I need to apply to the pixel length of the measuring line?

Do I have to take into account the distance from the vanishing point, since the real-world length per pixel increases the closer you get to the vanishing point? Do I need to use the gradient of the known line to give me a rate of change?
A good study of this and similar problems can be found in Antonio Criminisi's papers and Ph.D. thesis on single-view metrology. This is a good link to start with, and the whole collection of papers is here.
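
In case it helps, here is a minimal sketch of the cross-ratio construction used in that work, under the assumption that the reference segment lies on (or has already been transferred onto) the same image line as the segment you want to measure. The vanishing point v of that line is the image of the point at infinity, so the cross-ratio (r1, r2; p, v) measured in the image equals Z / (Z - L_ref) in the world, where Z is p's real-world distance from r1 along the line; solving for Z gives the compensation you are after. The function and variable names below are purely illustrative, not from any particular library.

```python
import numpy as np

def world_length(a, b, r1, r2, v, ref_length):
    """Estimate the real-world length of image segment (a, b).

    Assumes a, b, r1, r2 and the vanishing point v are (approximately)
    collinear in the image, and that the reference segment (r1, r2) has
    known real-world length ref_length.

    Since v is the image of the point at infinity on the line, the image
    cross-ratio (r1, r2; p, v) equals the world value Z / (Z - ref_length),
    where Z is p's real-world distance from r1 along the line.
    """
    r1 = np.asarray(r1, dtype=float)
    d = np.asarray(v, dtype=float) - r1
    d /= np.linalg.norm(d)                      # unit direction of the image line

    def coord(p):
        # signed 1-D coordinate of an image point along the line, origin at r1
        return float(np.dot(np.asarray(p, dtype=float) - r1, d))

    t_r2, t_v = coord(r2), coord(v)

    def world_coord(p):
        t_p = coord(p)
        n = t_p * (t_v - t_r2)                  # numerator of the image cross-ratio
        m = (t_p - t_r2) * t_v                  # denominator of the image cross-ratio
        return ref_length * n / (n - m)         # Z = L * cr / (cr - 1), written to avoid cr = inf

    return abs(world_coord(b) - world_coord(a))

# Toy check with a 1-D pinhole model: a point at depth Z on a line parallel to
# the optical axis images at u = 1/Z, so the vanishing point is u = 0.
a, b = (1.0, 0.0), (0.25, 0.0)                  # depths 1 and 4
r1, r2 = (1.0, 0.0), (0.5, 0.0)                 # depths 1 and 2 -> reference length 1
v = (0.0, 0.0)
print(world_length(a, b, r1, r2, v, ref_length=1.0))   # ~3.0
```

If your known distance sits on a parallel line rather than on the measuring line itself, you first need to transfer it across using a second vanishing point (the one defined by the lines joining corresponding endpoints of the two segments), which is exactly the construction Criminisi describes.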