I'm building a web application that uses elevations along a path, so I'm using the Maps API to extract them. However, it's important that I know how accurate this information is. The API documentation states that the elevation resolution is a value "indicating the maximum distance between data points from which the elevation was interpolated, in meters." This is exactly what I want, but the resolution values I get back seem odd. Take the example below, for the south coast of England:
https://maps.googleapis.com/maps/api/elevation/json?locations=50.8,-1.1
This example gives a resolution of 610 metres. Having sifted through the net to find out what Google uses for elevation data, SRTM3 crops up a lot. I have used this dataset extensively and know it has a resolution of 3 arc-seconds (approximately 90 m) and is available for most of the world. But Google also states that they use many sources of elevation data to give the best possible coverage. Based on this, I would expect the elevation resolution to be on the order of 90 m or better. Sampling across Europe, though, this doesn't seem to be the case: the value is often well over 100 m, and 610 metres for the south coast of England seems far too coarse given the kind of data Google would be using.
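To show where the number comes from, here is a minimal sketch of parsing the JSON the Elevation API returns and pulling out the `resolution` field. The response shape (`results`, `elevation`, `location`, `resolution`, `status`) follows the API documentation; the sample values are illustrative stand-ins for what the live URL above returns, not captured output.

```python
import json

# Illustrative Elevation API response for locations=50.8,-1.1.
# "resolution" is the field in question: the maximum distance, in
# metres, between the data points the elevation was interpolated from.
response_text = """
{
  "results": [
    {
      "elevation": 7.9,
      "location": {"lat": 50.8, "lng": -1.1},
      "resolution": 610.8
    }
  ],
  "status": "OK"
}
"""

data = json.loads(response_text)
if data["status"] == "OK":
    for result in data["results"]:
        print(f"elevation={result['elevation']} m, "
              f"resolution={result['resolution']} m")
```

In a real application you would fetch this JSON over HTTP (with an API key) instead of hard-coding it; the parsing is the same.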
So my question is: can anyone shed any light on this, or am I missing something or doing something wrong? I know some regions of the US give great resolution (~4 metres), but as I say, intuitively you would expect at least 90 m resolution for most of the world.
The API returns the resolution of the coarsest asset contributing to the sample's tile. In this case there is 30-arc-second bathymetry contributing, which was resampled to 600 m+ on ingest. The rest of the tile is SRTM, which has a resolution after ingest of ~75 m.
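The reporting rule described above can be sketched as a simple max over the contributing sources. The source names and post-ingest resolutions here are illustrative, taken from the figures in this answer, not from any published Google data pipeline.

```python
# Hedged sketch of the rule: the resolution reported for a sample is
# that of the coarsest source contributing to its tile, so one coarse
# asset dominates even if finer data covers most of the tile.
sources_in_tile = {
    "SRTM (post-ingest)": 75.0,         # ~75 m after ingest
    "30-arc-sec bathymetry": 610.8,     # resampled to 600 m+ on ingest
}

reported_resolution = max(sources_in_tile.values())
print(reported_resolution)  # the coarsest contributing asset wins
```

This is why a land point near the coast can report ~610 m even though the terrain itself is covered by ~75 m SRTM data.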