 

Geopy: catch timeout error

I am using geopy to geocode some addresses and I want to catch the timeout errors and print them out so I can do some quality control on the input. I am putting the geocode request in a try/except block, but it's not working. Any ideas on what I need to do?

Here is my code:

try:
    location = geolocator.geocode(my_address)
except ValueError as error_message:
    print("Error: geocode failed on input %s with message %s" % (my_address, error_message))

I get the following exception:

File "/usr/local/lib/python2.7/site-packages/geopy/geocoders/base.py", line 158, in _call_geocoder
    raise GeocoderTimedOut('Service timed out')
    geopy.exc.GeocoderTimedOut: Service timed out

Thank you in advance!

asked Jan 13 '15 by MoreScratch

3 Answers

Try this:

from geopy.geocoders import Nominatim
from geopy.exc import GeocoderTimedOut

my_address = '1600 Pennsylvania Avenue NW Washington, DC 20500'

# Recent geopy versions require a user_agent string for Nominatim
geolocator = Nominatim(user_agent="my-geocoding-app")
try:
    location = geolocator.geocode(my_address)
    print(location.latitude, location.longitude)
except GeocoderTimedOut as e:
    print("Error: geocode failed on input %s with message %s" % (my_address, e))

You can also consider increasing the timeout on the geocode call you are making to your geolocator. In my example it would be something like:

location = geolocator.geocode(my_address, timeout=10)

or

location = geolocator.geocode(my_address, timeout=None)
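
If the timeouts are only intermittent, another option is to retry the call a few times before giving up. A minimal sketch, assuming a Nominatim geolocator as above (the retry count and delay values are arbitrary illustrations, not part of the original answer):

import time
from geopy.geocoders import Nominatim
from geopy.exc import GeocoderTimedOut

geolocator = Nominatim(user_agent="my-geocoding-app")

def geocode_with_retry(address, retries=3, delay=2):
    # Try the geocode call up to `retries` times, waiting `delay` seconds between attempts
    for attempt in range(retries):
        try:
            return geolocator.geocode(address, timeout=10)
        except GeocoderTimedOut:
            if attempt == retries - 1:
                raise
            time.sleep(delay)

location = geocode_with_retry('1600 Pennsylvania Avenue NW Washington, DC 20500')
if location:
    print(location.latitude, location.longitude)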
answered by Imran

I dealt with the same problem for many days. This is my code:

geolocator = Nominatim(user_agent="ny_explorer")
location = geolocator.geocode(address_venue)

ERROR Service timed out

Solution: pass a timeout keyword argument to the geocode call:

location = geolocator.geocode(address_venue,timeout=10000)
answered by megh_sat


You may be experiencing this problem because you requested the same address many times and the service temporarily blocked or throttled you under its usage policy. The policy allows no more than one request per second and asks you to cache your results. I ran into this problem, and you have a couple of solutions. If you don't want to change your code much, you can get a Google API key that you can use for something like 2500 requests/day for free, or you can cache your results. Because I was already using DynamoDB on AWS for my problem, I went ahead and created a table to cache my results in. Here is the gist of my code.
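
A minimal sketch of the rate-limit-and-cache idea (not the original gist): it uses geopy's built-in RateLimiter to stay under one request per second and a plain in-memory dict standing in for a real store such as DynamoDB; the names here are illustrative:

from geopy.geocoders import Nominatim
from geopy.extra.rate_limiter import RateLimiter

geolocator = Nominatim(user_agent="my-geocoding-app")
# Respect the usage policy: at most one request per second
geocode = RateLimiter(geolocator.geocode, min_delay_seconds=1)

# Simple cache so repeated addresses are only requested once
cache = {}

def cached_geocode(address):
    if address not in cache:
        location = geocode(address)
        cache[address] = (location.latitude, location.longitude) if location else None
    return cache[address]

print(cached_geocode('1600 Pennsylvania Avenue NW Washington, DC 20500'))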

answered by tylerjw