
Best way to log API calls per minute / per hour

We are using reverse geocoding in a Rails web service, and have run into quota problems when using the Google reverse geocoder through Geokit. We are also implementing the SimpleGeo service, and I want to be able to track how many requests per minute/hour we are making.

Any suggestions for tracking our reverse-geocoding calls?

Our code will look something like the following. Would you do any of these?

  • Add a custom logger and process the logs in the background daily
  • Use a super-fantastic gem that I don't know about that handles quotas and rate tracking easily
  • Insert a record into the database for each call and run queries there

Note: I don't need the data in real time; I just want to know, for any hourly period, our usual and maximum requests per hour (and total monthly requests).

def use_simplegeo(lat, lng)
  SimpleGeo::Client.set_credentials(SIMPLE_GEO_OAUTHTOKEN, SIMPLE_GEO_OAUTHSECRET)
  # maybe do logging/tracking here?
  nearby_address = SimpleGeo::Client.get_nearby_address(lat, lng)

  # Build our model from the reverse-geocode result
  located_location = LocatedLocation.new
  located_location.city    = nearby_address[:place_name]
  located_location.county  = nearby_address[:county_name]
  located_location.state   = nearby_address[:state_code]
  located_location.country = nearby_address[:country]
  located_location
end

Thanks!

asked Jul 26 '10 by Jesse Wolgamott

1 Answer

The first part here does not answer the question you are asking, but it may be helpful if you haven't considered it before.

Have you looked at not doing your reverse geocoding on your server (i.e. through Geokit) but instead having it done by the client? In other words, some JavaScript loaded into the user's browser would make the Google geocoder API calls on behalf of your service.

If your application could support this approach, then it has a number of advantages:

  • You get around the quota problem, because your distributed users each have their own daily quota and don't consume yours
  • You don't expend your own server resources doing this

If you would still like to log your geocoder queries and you are concerned about the performance impact on your primary application database, then you might consider one of the following options:

  1. Create a separate database (or databases) just for logging (which is write-intensive) and write to it synchronously. It could be relational, but MongoDB or Redis might work well here too (see the sketch after this list)
  2. Log to the file system (with a custom logger) and then cron these logs in batches into structured, queryable storage later. The storage could be external, such as Amazon S3, if that works better
  3. Write a record into SimpleGeo each time you geocode, and add custom metadata to those records to tie them back to your own model(s)
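
To make the Redis flavour of option 1 concrete, here is a minimal sketch of hourly counters built on the redis gem. The GeocodeCallTracker class, the key scheme, and the method names are illustrative assumptions of mine, not part of SimpleGeo, Geokit, or Redis itself:

require 'redis'

# Hypothetical tracker: counts calls per provider in hourly Redis buckets.
class GeocodeCallTracker
  def initialize(redis = Redis.new)
    @redis = redis
  end

  # Record one call under an hourly bucket key,
  # e.g. "geocode:simplegeo:2010072615"
  def track(provider)
    key = hour_key(provider, Time.now.utc)
    @redis.incr(key)
    # Keep counters around for ~32 days so monthly totals can still be summed
    @redis.expire(key, 60 * 60 * 24 * 32)
  end

  # Total calls for a provider over the last `hours` hours
  # (missing buckets come back as nil, and nil.to_i is 0)
  def calls_in_last(provider, hours)
    keys = (0...hours).map { |h| hour_key(provider, Time.now.utc - h * 3600) }
    @redis.mget(*keys).inject(0) { |sum, count| sum + count.to_i }
  end

  private

  def hour_key(provider, time)
    "geocode:#{provider}:#{time.strftime('%Y%m%d%H')}"
  end
end

You would call something like GeocodeCallTracker.new.track('simplegeo') just before get_nearby_address in your use_simplegeo method, and calls_in_last('simplegeo', 24) (or 24 * 31 for a rough monthly total) when reporting. Since INCR and EXPIRE are cheap O(1) operations, the write path adds very little latency to each geocoding call.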
answered Sep 22 '22 by bjg