This is a followup to this question.
I seem to be stuck on this. Basically, I need to be able to convert back and forth between referring to coordinates in the standard degree system and referring to them by a distance north from the south pole along the international date line, followed by a distance east starting from that point on the date line. To do this (as well as some more general distance-measuring stuff), I have one method for determining the distance between two lat/lon points, and another method that takes a lat/lon point, a heading and a distance, and returns the lat/lon point at the end of that course.
Here are the two static methods I've defined:
/* Takes two lat/lon pairs and returns the distance between them in meters.
 * (Assumes "import static java.lang.Math.*;" for toRadians, toDegrees, sin, cos, acos, etc.)
 */
public static double distance(double lat1, double lon1, double lat2, double lon2) {
    double theta = toRadians(lon1 - lon2);
    lat1 = toRadians(lat1);
    lon1 = toRadians(lon1);
    lat2 = toRadians(lat2);
    lon2 = toRadians(lon2);
    double dist = sin(lat1) * sin(lat2) + cos(lat1) * cos(lat2) * cos(theta);
    dist = toDegrees(acos(dist)) * 60 * 1.1515 * 1.609344 * 1000;
    return dist;
}
/* endOfCourse takes a lat/lon pair, a heading (in degrees clockwise from north), and a distance (in meters), and returns
 * the lat/lon pair (in degrees) that would be reached by traveling that distance in that direction from the given point.
 */
public static double[] endOfCourse(double lat1, double lon1, double tc, double dist) {
    double pi = Math.PI;
    lat1 = toRadians(lat1);
    lon1 = toRadians(lon1);
    tc = toRadians(tc);
    double dist_radians = toRadians(dist / (60 * 1.1515 * 1.609344 * 1000));
    double lat = asin(sin(lat1) * cos(dist_radians) + cos(lat1) * sin(dist_radians) * cos(tc));
    double dlon = atan2(sin(tc) * sin(dist_radians) * cos(lat1), cos(dist_radians) - sin(lat1) * sin(lat));
    double lon = ((lon1 - dlon + pi) % (2 * pi)) - pi;
    double[] endPoint = new double[2];
    endPoint[0] = toDegrees(lat); endPoint[1] = toDegrees(lon);  // convert back to degrees (see UPDATE 1)
    return endPoint;
}
And here's the function I'm using to test it:
public static void main(String args[]) throws java.io.IOException, java.io.FileNotFoundException {
    double distNorth = distance(0.0, 0.0, 72.0, 0.0);
    double distEast = distance(72.0, 0.0, 72.0, 31.5);
    double lat1 = endOfCourse(0.0, 0.0, 0.0, distNorth)[0];
    double lon1 = endOfCourse(lat1, 0.0, 90.0, distEast)[1];
    System.out.println("end at: " + lat1 + " / " + lon1);
    return;
}
The "end at" values should be appx. 72.0 / 31.5. But instead I'm getting approximately 1.25 / 0.021.
I assume I must be missing something stupid, forgetting to convert units somewhere, or something... Any help would be greatly appreciated!
UPDATE 1:
I had (correctly) written the distance function to return meters, but wrote kilometers in the comments by mistake ... which of course confused me when I came back to it today. Anyway, now that's fixed, and I've fixed the factoring error in the endOfCourse method, and I also realized I'd forgotten to convert back to degrees from radians in that method too. Anyway: while it appears I'm now getting the correct latitude number (71.99...), the longitude number is way off (I get 3.54 instead of 11.5).
UPDATE 2: I had a typo in the test, as mentioned below. It's now fixed in the code. The longitude number is still, however, wrong: I'm now getting -11.34 instead of 11.5. I think there must be something wrong with these lines:
double dlon = atan2(sin(tc) * sin(dist_radians) * cos(lat1), cos(dist_radians) - sin(lat1) * sin(lat));
double lon = ((lon1-dlon + pi) % (2*pi)) - pi;
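One possibility worth checking (just a guess on my part, not something established in this thread): the Aviation Formulary formulas these lines follow treat west longitude as positive, which is where the minus sign on dlon comes from. With the usual east-positive convention the sign may need to flip, something like:

double dlon = atan2(sin(tc) * sin(dist_radians) * cos(lat1),
                    cos(dist_radians) - sin(lat1) * sin(lat));
double lon = ((lon1 + dlon + pi) % (2 * pi)) - pi;  // '+' rather than '-' for east-positive longitudes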
To convert the latitude and longitude values of both points from degrees to radians, divide them by 180/pi, which is approximately 57.29577951 (22/7 is only a rough approximation of pi; in code, use Math.PI). If you want the distance between two places in miles, use 3,963 miles as the radius of the Earth.
A degree of longitude is about 111 kilometers (69 miles) at its widest. Degrees of longitude are widest near the Equator, where the Earth bulges out; because of the Earth's curvature, the actual distance covered by a degree, minute, or second of longitude depends on how far it is from the Equator.
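In Java, that division is exactly what Math.toRadians does; a minimal sketch of the equivalence (the method here is just for illustration):

public static double degreesToRadians(double degrees) {
    return degrees / (180.0 / Math.PI);   // same result as Math.toRadians(degrees)
}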
You've got a serious case of the magic numbers in the code. The expression:
(60 * 1.1515 * 1.609344 * 1000)
appears twice, but there's not much explanation of it. With some help: 1.609344 is the number of kilometres in a mile; 60 is the number of minutes in a degree; 1000 is the number of metres in a kilometre; and 1.1515 is the number of statute miles in a nautical mile (thanks, DanM). One nautical mile is the length of one minute of latitude at the equator.
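As a sketch, those values could be pulled out into named constants (the names are mine, purely for illustration; the values are the ones explained above):

private static final double MINUTES_PER_DEGREE   = 60.0;
private static final double STATUTE_PER_NAUTICAL = 1.1515;    // statute miles per (old-definition) nautical mile
private static final double KM_PER_STATUTE_MILE  = 1.609344;
private static final double METRES_PER_KM        = 1000.0;
private static final double METRES_PER_DEGREE =
        MINUTES_PER_DEGREE * STATUTE_PER_NAUTICAL * KM_PER_STATUTE_MILE * METRES_PER_KM;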
I assume you are using a spherical earth model, rather than a spheroidal earth? The algebra isn't complex enough to be spheroidal.
The first formula - conversion between two latitude and longitude pairs - is odd. You need both delta-lat (Δλ) and delta-lon (Δφ) to sort out the answer. Further, the distance between the pairs:
(60° N, 30° W), (60° N, 60° W)
(60° N, 60° W), (60° N, 90° W)
should be the same - but I'm pretty sure your code produces different answers.
So, I think you need to go back to your spherical trigonometry reference materials and see what you're doing wrong. (It would take me a while to find my book on the subject - it would need to be unpacked from whichever box it is in.)
[...time passes...unpacking done...]
Given a spherical triangle with angles A, B, C at the vertices and sides a, b, c opposite those vertices (that is, side a is from B to C, etc.), the Cosine Formula is:
cos a = cos b . cos c + sin b . sin c . cos A
Applying this to the problem, we can call the two points given B and C, and we create a right spherical triangle with a right angle at A.
ASCII art at its worst:
          + C
         /|
        / |
     a /  | b
      /   |
     /    |
    /     |
 B +------+ A
        c
The side c is equal to the difference in longitude; the side b is equal to the difference in latitude; the angle A is 90°, so cos A = 0. Therefore, I believe an equation for a is:
cos a = cos Δλ . cos Δφ + sin Δλ . sin Δφ . cos 90°
a = arccos (cos Δλ . cos Δφ)
The angle a in radians is then converted to a distance by multiplying by the radius of the Earth. Alternatively, given a in degrees (and fractions of a degree), then there are 60 nautical miles to one degree, hence 60 * 1.1515 statute miles, and 60 * 1.1515 * 1.609344 kilometres to one degree. Unless you want the distance in metres, I don't see a need for the factor of 1000.
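As a sketch of those two conversions in Java, assuming a spherical Earth with a mean radius of 6371 km (the radius figure is my assumption, not something given above):

public static double kmFromAngleRadians(double aRadians) {
    return aRadians * 6371.0;                     // arc length = angle (radians) x radius
}

public static double kmFromAngleDegrees(double aDegrees) {
    return aDegrees * 60 * 1.1515 * 1.609344;     // degrees -> nautical miles -> statute miles -> km
}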
Paul Tomblin points to Aviation Formulary v1.44 as a source of the equation - and indeed, it is there, together with a more numerically stable version for when the difference in position is small.
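For reference, a sketch of the numerically better-behaved variant (the haversine form), again assuming a 6371 km spherical Earth; this is my own transcription rather than code lifted from the Formulary:

public static double haversineDistanceKm(double lat1, double lon1, double lat2, double lon2) {
    final double R = 6371.0;                      // assumed mean Earth radius in km
    double dLat = Math.toRadians(lat2 - lat1);
    double dLon = Math.toRadians(lon2 - lon1);
    double s = Math.pow(Math.sin(dLat / 2), 2)
             + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
               * Math.pow(Math.sin(dLon / 2), 2);
    return 2 * R * Math.asin(Math.sqrt(s));       // central angle (radians) x radius
}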
Going to basic trigonometry, we also know that:
cos (A - B) = cos A . cos B + sin A . sin B
Applying that twice in the equation I gave might well end up at the formula in the Aviation Formulary.
(My reference: "Astronomy: Principles and Practice, Fourth Edition" by A E Roy and D Clarke (2003); my copy is the first edition from 1977, Adam Hilger, ISBN 0-85274-346-7.)
NB Check out (Google) 'define:"nautical mile"'; it appears that a nautical mile is now 1852 m (1.852 km) by definition. The multiplier 1.1515 corresponds to the old definition of the nautical mile as approximately 6080 ft. Using bc with a scale of 10, I get:
(1852/(3*0.3048))/1760
1.1507794480
Which factor works for you depends on what your basis is.
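The same calculation in Java, for comparison (1760 yards to the statute mile, 3 feet to the yard, 0.3048 metres to the foot):

static final double STATUTE_MILES_PER_NAUTICAL_MILE = 1852.0 / (1760 * 3 * 0.3048);  // ~= 1.1507794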
Looking at the second problem from first principles, we have a slightly different setup, and we need the 'other' spherical trigonometry equation, the Sine Formula:
sin A   sin B   sin C
----- = ----- = -----
sin a   sin b   sin c
Adapting the previous diagram:
          + C
         /|
        / |
     a /  | b
   |  /   |
   |X/    |
   |/     |
 B +------+ A
        c
You are given starting point B, angle X = 90° - B, length (angle) a, and angle A = 90°. What you are after is b (the delta in latitude) and c (the delta in longitude).
So, we have:
sin a   sin b
----- = -----
sin A   sin B
Or
        sin a . sin B
sin b = -------------
            sin A
Or, since A = 90°, sin A = 1, and sin B = sin (90° - X) = cos X:
sin b = sin a . cos X
That means you convert the distance travelled into an angle a, take the sine of that, multiply by the cosine of the course direction, and take the arcsine of the result.
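A sketch of that step in Java, assuming a spherical Earth of radius 6371 km (the radius and the method name are mine, for illustration only):

public static double latitudeDeltaDegrees(double distanceKm, double courseDegrees) {
    final double EARTH_RADIUS_KM = 6371.0;             // assumed mean radius
    double a = distanceKm / EARTH_RADIUS_KM;           // distance travelled as an angle, in radians
    double x = Math.toRadians(courseDegrees);          // course X, clockwise from north
    double b = Math.asin(Math.sin(a) * Math.cos(x));   // sin b = sin a . cos X
    return Math.toDegrees(b);                          // delta-latitude in degrees
}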
Given a, b (just calculated) and A and B, we can apply the cosine formula to get c. Note that we cannot simply re-apply the sine formula to get c since we don't have the value of C and, because we're playing with spherical trigonometry, there is no convenient rule that C = 90° - B (the sum of the angles in a spherical triangle can be greater than 180°; consider an equilateral spherical triangle with all angles equal to 90°, which is perfectly feasible).
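Continuing the sketch: with A = 90°, the Cosine Formula cos a = cos b . cos c + sin b . sin c . cos A collapses to cos a = cos b . cos c, so c follows directly (this reduction is mine, following the setup above; a and b are angles in radians):

public static double longitudeDeltaRadians(double a, double b) {
    return Math.acos(Math.cos(a) / Math.cos(b));   // cos c = cos a / cos b, since cos 90 deg = 0
}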