I am having problems implementing the function described here.
This is my Java implementation:
private static double[] pointRadialDistance(double lat1, double lon1,
        double radianBearing, double radialDistance) {
    double lat = Math.asin(Math.sin(lat1)*Math.cos(radialDistance)
            + Math.cos(lat1)*Math.sin(radialDistance)*Math.cos(radianBearing));
    double lon;
    if (Math.cos(lat) == 0) { // Endpoint a pole
        lon = lon1;
    }
    else {
        lon = ((lon1 - Math.asin(Math.sin(radianBearing)*Math.sin(radialDistance)/Math.cos(lat))
                + Math.PI) % (2*Math.PI)) - Math.PI;
    }
    return new double[]{lat, lon};
}
I convert the bearing from degrees to radians and convert the distance (km) into an angular distance in radians before calling the function, so that's not the problem.
However, when I input coordinates such as lat = 49.25705, lon = -123.140259, with a bearing of 225 (south-west) and a distance of 1 km,
I get this returned: lat: -1.0085434360125864, lon: -3.7595299668539504.
It's obviously not correct. Can anyone see what I am doing wrong?
Thanks
If your starting point is (0,0), and your new point is r units away at an angle of θ, you can find the coordinates of that point using the equations x = r cosθ and y = r sinθ.
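As a quick illustration (my own sketch, not from the original answers), that planar conversion looks like this in Python; it is only a small-distance, flat-plane approximation and does not replace the spherical formulas below:

from math import cos, sin, radians

r = 1.0               # distance from the origin
theta = radians(225)  # angle, converted from degrees to radians
x = r * cos(theta)    # x = r cos(theta)
y = r * sin(theta)    # y = r sin(theta)
print "x = %.4f, y = %.4f" % (x, y)  # both negative: "south-west" of (0,0)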
You can calculate the azimuth between the points (ϕ₁, λ₁) and (ϕ₂, λ₂), where ϕ is the latitude and λ the longitude, as follows: compute x = sin Δλ × cos ϕ₂, where Δλ = λ₂ − λ₁ is the difference in longitudes, and y = cos ϕ₁ sin ϕ₂ − sin ϕ₁ cos ϕ₂ cos Δλ; the azimuth is then atan2(x, y).
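Here is a minimal sketch of that azimuth formula in Python (the function name initial_azimuth is mine, not from the original posts; it assumes degrees in, degrees out, with the usual east-positive longitude convention):

from math import atan2, cos, sin, radians, degrees

def initial_azimuth(lat1, lon1, lat2, lon2):
    # Initial bearing from point 1 to point 2, in degrees clockwise from north
    phi1, phi2 = radians(lat1), radians(lat2)
    dlam = radians(lon2 - lon1)  # difference in longitudes
    x = sin(dlam) * cos(phi2)
    y = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dlam)
    return (degrees(atan2(x, y)) + 360) % 360  # normalize to [0, 360)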
It seems like these are the issues in your code:

- You need to convert lat1 and lon1 to radians before calling your function.
- You may be scaling radialDistance incorrectly.
- Testing abs(x-y) < threshold is safer than x == y for testing two floating-point numbers x and y for equality.
- You need to convert lat and lon back from radians to degrees.

Here is my implementation of your code in Python:
#!/usr/bin/env python

from math import asin, cos, pi, sin

rEarth = 6371.01   # Earth's average radius in km
epsilon = 0.000001 # threshold for floating-point equality

def deg2rad(angle):
    return angle*pi/180

def rad2deg(angle):
    return angle*180/pi

def pointRadialDistance(lat1, lon1, bearing, distance):
    """
    Return final coordinates (lat2,lon2) [in degrees] given initial coordinates
    (lat1,lon1) [in degrees] and a bearing [in degrees] and distance [in km]
    """
    rlat1 = deg2rad(lat1)
    rlon1 = deg2rad(lon1)
    rbearing = deg2rad(bearing)
    rdistance = distance / rEarth # normalize linear distance to radian angle

    rlat = asin( sin(rlat1) * cos(rdistance) + cos(rlat1) * sin(rdistance) * cos(rbearing) )

    if cos(rlat) == 0 or abs(cos(rlat)) < epsilon: # Endpoint a pole
        rlon = rlon1
    else:
        rlon = ( (rlon1 - asin( sin(rbearing) * sin(rdistance) / cos(rlat) ) + pi ) % (2*pi) ) - pi

    lat = rad2deg(rlat)
    lon = rad2deg(rlon)
    return (lat, lon)

def main():
    print "lat1 \t lon1 \t\t bear \t dist \t\t lat2 \t\t lon2"
    testcases = []
    testcases.append((0,0,0,1))
    testcases.append((0,0,90,1))
    testcases.append((0,0,0,100))
    testcases.append((0,0,90,100))
    testcases.append((49.25705,-123.140259,225,1))
    testcases.append((49.25705,-123.140259,225,100))
    testcases.append((49.25705,-123.140259,225,1000))
    for lat1, lon1, bear, dist in testcases:
        (lat, lon) = pointRadialDistance(lat1, lon1, bear, dist)
        print "%6.2f \t %6.2f \t %4.1f \t %6.1f \t %6.2f \t %6.2f" % (lat1,lon1,bear,dist,lat,lon)

if __name__ == "__main__":
    main()
Here is the output:
lat1       lon1      bear     dist       lat2       lon2
 0.00      0.00       0.0      1.0       0.01       0.00
 0.00      0.00      90.0      1.0       0.00      -0.01
 0.00      0.00       0.0    100.0       0.90       0.00
 0.00      0.00      90.0    100.0       0.00      -0.90
49.26   -123.14     225.0      1.0      49.25    -123.13
49.26   -123.14     225.0    100.0      48.62    -122.18
49.26   -123.14     225.0   1000.0      42.55    -114.51
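As a sanity check (my addition, not part of the original answer), you can recompute the great-circle distance between each start point and the returned point with the standard haversine formula and confirm it matches the requested distance:

from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance in km between two (lat, lon) points given in degrees
    rlat1, rlon1, rlat2, rlon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((rlat2-rlat1)/2)**2 + cos(rlat1)*cos(rlat2)*sin((rlon2-rlon1)/2)**2
    return 2 * 6371.01 * asin(sqrt(a))

# Assumes pointRadialDistance from the script above is in scope
lat2, lon2 = pointRadialDistance(49.25705, -123.140259, 225, 100)
print haversine_km(49.25705, -123.140259, lat2, lon2)  # should print ~100.0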