I want to be able to calculate the coordinates of a 3D point, based on its distance from the origin, and two angles: "yaw" around Y-axis, and "pitch" around the X-axis.
In the example, distance from the origin would be 50 units, yaw 10 degrees, and pitch 10 degrees.
Is there a formula to retrieve the 3D result point?
If your starting point is (0, 0) and your new point is r units away at an angle of θ, you can find the coordinates of that point using the equations x = r cos θ and y = r sin θ.
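A minimal sketch of this 2D polar-to-Cartesian conversion in Python, using the question's example values of r = 50 and θ = 10° (taking the angle in degrees is an assumption for the example):

```python
import math

def polar_to_cartesian(r, theta_deg):
    """Convert 2D polar coordinates (r, theta in degrees) to Cartesian (x, y)."""
    theta = math.radians(theta_deg)
    return r * math.cos(theta), r * math.sin(theta)

x, y = polar_to_cartesian(50, 10)  # r = 50 units, theta = 10 degrees
```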
Divide the height of the object by the tangent of the elevation angle. For example, if the object in question is 150 feet tall and the elevation angle is 60 degrees (tan 60° ≈ 1.732), then 150 / 1.732 ≈ 86.603, so the horizontal distance from the object is about 86.603 feet.
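The arithmetic above can be checked directly; this sketch assumes a 60° elevation angle, for which tan 60° ≈ 1.732:

```python
import math

height = 150.0            # height of the object in feet
angle = math.radians(60)  # elevation angle; tan(60 degrees) is about 1.732
distance = height / math.tan(angle)  # horizontal distance from the object
```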
If you have a point which is defined by an azimuth angle (yaw), an altitude angle (pitch) and a distance along this direction vector, then you first have to transform the azimuth angle (yaw) and altitude angle (pitch) into a unit direction vector.
See Solar zenith angle, Azimuth or Euler angles.
In a coordinate system where the x axis points to the left, the z axis to the front, and the y axis is the up vector (a left-handed coordinate system), this can be calculated as follows:
x = sin(yaw) * cos(pitch)
y = sin(pitch)
z = cos(yaw) * cos(pitch)
where yaw is the clockwise angle between the z axis and the vector to the point (projected onto the XZ plane), and pitch is the angle between the XZ plane and the vector to the point.
This direction has to be multiplied by the distance to the origin:
P = distance * (x, y, z);
or
Px = distance * sin(yaw) * cos(pitch)
Py = distance * sin(pitch)
Pz = distance * cos(yaw) * cos(pitch)
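Putting it together for the numbers in the question (distance 50, yaw 10°, pitch 10°), here is a sketch in Python, assuming the left-handed, Y-up convention described above and angles given in degrees:

```python
import math

def spherical_to_cartesian(distance, yaw_deg, pitch_deg):
    """Convert distance/yaw/pitch to (x, y, z) in a left-handed, Y-up system."""
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    x = distance * math.sin(yaw) * math.cos(pitch)
    y = distance * math.sin(pitch)
    z = distance * math.cos(yaw) * math.cos(pitch)
    return x, y, z

px, py, pz = spherical_to_cartesian(50, 10, 10)
```

As a sanity check, the resulting point always lies exactly `distance` units from the origin, since the direction vector has unit length.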