 

Google URL Shortener 403 Rate Limit Exceeded

I am using the Google URL shortener API, and it was working fine until I started testing at load. I quickly started getting 403 Rate Limit Exceeded errors back from Google, even though I signed up to use the API and it comes with a quota of 1,000,000 hits a day. I can see the requests coming in on the Google reporting tool, but it is just sending back 403s for everything. The 403s started at around 345-350 hits to the API and have continued for hours.

Thoughts?

MyBrokenGnome asked Jul 21 '13 18:07


People also ask

How do I fix the 403 user rate limit exceeded?

Resolve a 403 error: Project rate limit exceeded. Raise the per-user quota in the Google Cloud project (for more information, request a quota increase), batch requests to make fewer API calls, and use exponential backoff to retry the request.
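The exponential-backoff advice above can be sketched as a small retry helper. This is my own illustration, not Google's client library: the `Supplier` stands in for whatever HTTP call raises the 403, and the base delay of 1 second is an assumed starting point.

```java
import java.util.concurrent.ThreadLocalRandom;
import java.util.function.Supplier;

public class Backoff {
    // Delay before retry number `attempt` (0-based): base * 2^attempt.
    static long delayMs(int attempt, long baseMs) {
        return baseMs << attempt;
    }

    // Retry `call` up to maxRetries times, doubling the wait after each
    // failure and adding random jitter so concurrent clients stay spread out.
    static <T> T withBackoff(Supplier<T> call, int maxRetries, long baseMs)
            throws InterruptedException {
        for (int attempt = 0; ; attempt++) {
            try {
                return call.get();
            } catch (RuntimeException e) { // e.g. thrown on a 403 response
                if (attempt >= maxRetries) throw e;
                long wait = delayMs(attempt, baseMs);
                long jitter = ThreadLocalRandom.current().nextLong(wait / 2 + 1);
                Thread.sleep(wait + jitter);
            }
        }
    }
}
```

With a base of 1000 ms, the waits grow 1s, 2s, 4s, 8s, ... plus jitter, which is the pattern the quoted advice describes.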

How do I fix rate limit exceeded?

Steps to fix the User Rate Limit Exceeded issue: you need to raise the limits in Google APIs. Step 1: Sign in to your Google developers console project. Step 2: Select the project from the top panel. Step 3: Select the project from the menu options.

Does Google URL shortener still work?

On March 30th 2018, Google announced they would be closing Google URL shortener to new users from April 13th, with existing users being able to use the service until its eventual discontinuation on March 30th 2019.

Why did Google discontinue URL shortener?

The decision to shut down goo.gl and move to FDL is a result of changes in the ways that people share information online. In its effort to keep up with these changes, Google is moving to "smart URLs" that redirect users in a variety of ways that were not possible back in 2009.


2 Answers

The API limits requests to 1 request per second per user.

A user is defined as a unique IP address.

So if you were doing your load testing from a single IP, this would have caused your rate-limit issue.

https://developers.google.com/analytics/devguides/reporting/mcf/v3/limits-quotas#general_api
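Given that limit, one fix is to throttle the load generator itself to one request per second per machine. Below is a minimal sketch of such a throttle; it is my own helper, not part of any Google client, and `intervalMs` would be 1000 for the documented limit.

```java
public class Throttle {
    private final long intervalMs;
    private long nextAllowedMs = 0;

    Throttle(long intervalMs) {
        this.intervalMs = intervalMs;
    }

    // Block the caller until at least intervalMs has passed since the
    // previous permitted call, serialising requests from this process.
    synchronized void acquire() throws InterruptedException {
        long now = System.currentTimeMillis();
        if (now < nextAllowedMs) {
            Thread.sleep(nextAllowedMs - now);
        }
        nextAllowedMs = Math.max(now, nextAllowedMs) + intervalMs;
    }
}
```

Each worker would call `acquire()` before hitting the API; note this only spaces out requests from one process and IP, which matches how the quota is counted per the quoted limit.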

Chase answered Sep 22 '22 06:09


I don't think "1 request / per second / per user", as written in the docs, is 100% correct in my case, i.e. the Google URL shortener case. (FYI: I am using "Public API access", not OAuth.)

I have almost the same problem, but for me it is more like "I get this error for some URLs for some period of time." What does that mean? Please continue reading.

These are what I found:

  • I can use 10 threads to call the Google URL shortener at the same time, but not always...
  • While processing, even if one URL fails on one thread, the other threads can still shorten their URLs.
  • When a URL fails and I later try the same URL again, it still does not work for some PERIOD OF TIME, even when no other processes are running. Even appending an extra string like "&test=1" does not help. But if I switch to another URL, it works.

So, I guess that Google's server may keep a cache entry for each URL. If a URL fails, you must wait a while for that cache entry to be released.

So, I had to write some crude code like this to solve my problem:

  • when there is a failure, that particular thread sleeps for 1 minute (yes, 1 minute)
  • and keeps trying up to 10 times (so in total it can take 10 minutes for a failed URL)

However, this crude code is fine for my case because I am using an ExecutorService with a fixed thread-pool size of 10. So if one URL fails, the other threads can still get their shortened URLs. It solves the problem... at least for me.
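The retry strategy above might look like the following sketch. The `shorten` function is a placeholder for the actual API call (goo.gl itself has since been discontinued), and the 60-second sleep and 10 tries are the values from the bullets.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.function.Function;

public class ShortenWorker {
    // Retry the shorten call up to maxTries times, sleeping between
    // failures; run inside a fixed pool so one stuck URL does not block
    // the other threads.
    static String shortenWithRetry(Function<String, String> shorten,
                                   String url, int maxTries, long sleepMs)
            throws InterruptedException {
        for (int attempt = 1; ; attempt++) {
            try {
                return shorten.apply(url);        // may throw on a 403
            } catch (RuntimeException e) {
                if (attempt >= maxTries) throw e; // give up after maxTries
                Thread.sleep(sleepMs);            // the answer uses 60_000 here
            }
        }
    }

    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(10);
        // Hypothetical submission; `api::shorten` would be the real client call:
        // pool.submit(() -> shortenWithRetry(api::shorten, url, 10, 60_000));
        pool.shutdown();
    }
}
```

Because each URL is a separate task in the fixed pool, a failing URL only stalls its own thread while it sleeps, which is the design choice the answer describes.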

Surasin Tancharoen answered Sep 21 '22 06:09