
How can I filter out fictional locations (ex. "under a rock", "hiding") from Google Maps API geocode results?

Google Maps API does a great job trying to locate a match for nearly every query. But if I'm only interested in real locations, how can I filter out Google's guesses?

For example, according to Google, "under a rock" is located at "The Rock, Shifnal, Shropshire TF11, UK". But a person who answers the question, "Where are you?" with "Under a rock" does not mean to indicate that they are in Shropshire, UK. Instead they just don't want to tell you — well, either that or they are in real trouble, thankfully with web access, stuck under some rock.

I have several million user-generated location strings that I'm attempting to find coordinates for. If someone writes "under a rock" I'd rather just leave the coordinates null instead of putting an obviously wrong point in Shropshire, UK.

Here are some other examples:

  • under a rock => Shropshire, UK
  • planet earth => Cheshire, UK
  • nowhere => Scituate, RI, USA
  • travelling => Madrid, Spain
  • hiding => Anderson, CA, USA
  • global => Midland, TX, USA
  • on the web => North Part, ON, Canada
  • internet => Frisco, TX, USA
  • worldwide => Mie Prefecture, Japan

Ultimately I'm after a solid way to return coordinates from a string but return false if the location is like the above.

I need to build a function that returns the following:

  • Twin Cities => Return the colloquial coordinates of Minneapolis-St. Paul
  • right behind you => false [Google gets this one "right" -- at least for my purposes]
  • under a rock => false
  • nowhere => false
  • Canada => Return coordinates
  • Mission District San Francisco => Return coordinates
  • Chicago => Return coordinates
  • a galaxy far far away => false [Google also gets this one "right" -- zero results]

What do you recommend?

Here's a comma-delimited array for you to play at home:

'twin cities','right behind you','under a rock','nowhere','canada','mission district san francisco','chicago','a galaxy far far away','london, england','1600 pennsylvania ave, washington, d.c.','california','41.87194,12.56738','global','worldwide','on the internet','mars'

And here's the URL format:

'http://maps.googleapis.com/maps/api/geocode/json?address=' + query + '&sensor=false'
ex: http://maps.googleapis.com/maps/api/geocode/json?address=twin+cities&sensor=false
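The geocode response itself carries some signal worth checking before reaching for anything fancier: `status` is `ZERO_RESULTS` for queries like "a galaxy far far away", and each result has a documented `geometry.location_type` field and sometimes a `partial_match` flag. The rejection rule below (drop results that are both partial matches and merely APPROXIMATE) is only a heuristic sketch, not a complete filter -- "under a rock" can still come back as a confident match:

```python
import json

def extract_coords(response_json):
    """Return (lat, lng) from a geocode response, or None for no/iffy matches.

    Heuristic only: a result that is both a partial_match and merely
    APPROXIMATE is treated as one of Google's guesses and rejected.
    """
    data = json.loads(response_json)
    if data.get("status") != "OK" or not data.get("results"):
        return None  # e.g. ZERO_RESULTS for "a galaxy far far away"
    result = data["results"][0]
    if result.get("partial_match") and \
       result["geometry"].get("location_type") == "APPROXIMATE":
        return None
    loc = result["geometry"]["location"]
    return (loc["lat"], loc["lng"])

# Trimmed sample response for "chicago" -- a locality is APPROXIMATE
# but not a partial match, so it is accepted.
sample = json.dumps({
    "status": "OK",
    "results": [{
        "geometry": {
            "location": {"lat": 41.8781136, "lng": -87.6297982},
            "location_type": "APPROXIMATE",
        },
    }],
})
print(extract_coords(sample))  # (41.8781136, -87.6297982)
```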
Ryan asked Jan 08 '14


3 Answers

I know there are Bayes classifier implementations in JavaScript. I've never tried them, though; I currently use a Ruby implementation that works well.

You could define two classifications (Real and Unreal) and train each with as many samples as you like (30 or 50 each?). The better trained your classifier is, the more accurate it will be.

Then you'd test each location string against the classifier before calling the Google Maps API, and skip the ones classified as Unreal.
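As a rough sketch of the idea (a from-scratch multinomial naive Bayes with Laplace smoothing, in Python rather than JavaScript or Ruby; the tiny training sets are just strings lifted from the question, so a real deployment would need far more samples):

```python
import math
from collections import Counter

class NaiveBayes:
    def __init__(self):
        self.word_counts = {"real": Counter(), "unreal": Counter()}
        self.doc_counts = Counter()

    def train(self, label, text):
        self.doc_counts[label] += 1
        self.word_counts[label].update(text.lower().split())

    def classify(self, text):
        vocab = set(self.word_counts["real"]) | set(self.word_counts["unreal"])
        total_docs = sum(self.doc_counts.values())
        best = None
        for label in ("real", "unreal"):
            counts = self.word_counts[label]
            total = sum(counts.values())
            # log prior + sum of Laplace-smoothed log likelihoods
            score = math.log(self.doc_counts[label] / total_docs)
            for word in text.lower().split():
                score += math.log((counts[word] + 1) / (total + len(vocab)))
            if best is None or score > best[1]:
                best = (label, score)
        return best[0]

nb = NaiveBayes()
for s in ("chicago", "london england", "mission district san francisco", "canada"):
    nb.train("real", s)
for s in ("under a rock", "nowhere", "right behind you", "planet earth"):
    nb.train("unreal", s)

print(nb.classify("under a rock"))  # unreal
```

Strings classified as "unreal" would simply never be sent to the geocoder.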

Hugo Chevalier answered Nov 10 '22


This might not be a direct answer to your question.

If you are going through thousands of user inputs already saved in the database and trying to filter out the invalid ones, I think it is too late and not feasible. The output can only be as good as the input.

The better way is to make the input as valid as possible up front, since end users don't always know what they want.

I would suggest that users enter their address through autocomplete, so that you always have a valid address:

  1. The user enters text and selects one of the suggestions.
  2. A marker and info window are shown.
  3. When the user confirms the input, you save the info window text as the user's input, not the raw text-field value.

Done this way, you don't need to validate or filter user input.
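A rough server-side sketch of step 3, assuming the front end posts the structured place it got back from the autocomplete widget. The `place_id`, `formatted_address`, and `geometry.location` field names mirror the Places API; the function name, the dict-like `db` store, and the example id are made up for illustration:

```python
def save_confirmed_location(db, user_id, place):
    """Store the structured place the user confirmed, not their raw text.

    `place` is the dict the front end sends after the user picks an
    autocomplete suggestion and confirms the marker/info window.
    """
    if not place.get("place_id"):
        # User typed free text and never picked a suggestion: store null,
        # which is exactly the "under a rock" behavior the question wants.
        db[user_id] = None
        return None
    record = {
        "place_id": place["place_id"],
        "address": place["formatted_address"],
        "lat": place["geometry"]["location"]["lat"],
        "lng": place["geometry"]["location"]["lng"],
    }
    db[user_id] = record
    return record

db = {}
confirmed = {
    "place_id": "example-place-id",   # hypothetical value
    "formatted_address": "Chicago, IL, USA",
    "geometry": {"location": {"lat": 41.8781, "lng": -87.6298}},
}
save_confirmed_location(db, "user-42", confirmed)
```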

allenhwkim answered Nov 10 '22


To truly succeed here you are going to have to build a database-driven system that facilitates both positive and negative lookups, with AI that gets smarter over time, just like Google did. I don't believe there is a single algorithm that will filter out results based on cosmetics alone.

I looked around and found a site that contains every city in the world. Unfortunately, it doesn't offer the data as a single list, so you'd have to spend a bit of time harvesting it. The site is http://www.fallingrain.com/world/index.html.

They seem to use individual directories to organize countries, states, and cities, broken down further alphabetically. It is, however, the only comprehensive source I could find.

If you manage to get all of these locations into a database, you will have the beginnings of a positive lookup system for your queries. You'll also need to start building separate lists of bi-, tri-, and quad-city areas, as well as popular destinations and landmarks.

You should also store a negative lookup table for all known mismatches. People have a tendency to generate similar false data and typos across large populations, so the most popular "nowhere" and "planet earth" answers will be repeated over and over again, in every language you can think of.

One of the benefits of this strategy is that you can run relational queries against your data to get matches in bulk as well as one at a time. Since some false negatives will occur at the beginning, your main decision is what to do with unmatched items. You may want to adopt a strategy where you can both reject non-matches and substitute partial matches with the nearest actual match.
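In code, the lookup order described above might look like this (the table contents and the `geocode` fallback are placeholders; in practice the tables would live in your database and the fallback would call the Google API):

```python
def lookup(query, positive, negative, geocode):
    """Resolve a location string using cached lookup tables.

    `positive` maps known-good strings to coordinates, `negative` is a
    set of known mismatches ("nowhere", "planet earth", ...), and
    `geocode` is whatever fallback you use for unmatched items.
    """
    key = query.strip().lower()
    if key in negative:
        return False                 # known mismatch: reject
    if key in positive:
        return positive[key]         # known city / landmark
    return geocode(key)              # unmatched: defer to fallback

positive = {"chicago": (41.8781, -87.6298), "canada": (56.1304, -106.3468)}
negative = {"under a rock", "nowhere", "planet earth", "worldwide"}

print(lookup("Under a Rock", positive, negative, lambda q: None))  # False
print(lookup("Chicago", positive, negative, lambda q: None))       # (41.8781, -87.6298)
```

Because both tables grow as you process more strings, repeat offenders get cheaper to handle over time, which is the "gets smarter" property described above.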

Anyhow, I hope this helps. It is a bit of effort, but if it's important it will be worth it. Who knows, you may end up with a database that's actually worth something. Maybe even a Google Maps gateway service for companies/developers who need the same functionality. (:

Take care.

drankin2112 answered Nov 10 '22