I know you can go to http://webcache.googleusercontent.com/search?q=cache:http://example.com/ to view Google's cache of any URL, but do they provide a paid API for hitting thousands of these URLs?
I don't want to just fire off HTTP GETs to these URLs too fast and get my IP address banned or upset Google.
Just wondering if they offer a way to pay and do this through official channels like they do with their search API.
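In the absence of an official channel, the cautious approach described above amounts to building the cache URL and spacing out requests. Here is a minimal sketch of that idea; the `cache_url` format comes from the question, while the throttling delay is an arbitrary assumption (Google publishes no sanctioned rate, and bulk fetching may still violate their terms or get blocked):

```python
import time
import urllib.request

CACHE_PREFIX = "http://webcache.googleusercontent.com/search?q=cache:"

def cache_url(url):
    """Build the Google cache URL for a page, as shown in the question."""
    return CACHE_PREFIX + url

def fetch_politely(urls, delay_seconds=5.0):
    """Fetch each cache URL with a fixed pause between requests.

    The 5-second delay is a guess at being polite, not an official limit.
    Returns a dict mapping each original URL to the HTTP status code,
    or to the exception if the request failed.
    """
    results = {}
    for url in urls:
        try:
            with urllib.request.urlopen(cache_url(url), timeout=10) as resp:
                results[url] = resp.status
        except Exception as exc:
            results[url] = exc
        time.sleep(delay_seconds)
    return results
```

Even throttled, this is still unofficial scraping, which is exactly what the question is trying to avoid.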
Google keeps webpages in their cache for about 90 days, or until the page is crawled again.
For comparison, Google's terms generally forbid this kind of bulk caching. The Places API terms, for example, state: applications using the Places API are bound by the terms of your Agreement with Google, and subject to those terms you must not pre-fetch, index, store, or cache any Content except under the limited conditions stated in the terms.
Google doesn't seem to have an API to access the cached results:
There are some attempts to scrape it and wrap it in APIs, such as this Perl module.
Other than that, the Wayback Machine has an API for cached versions of sites. Perhaps that will do?
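The Wayback Machine's Availability API is a simple JSON endpoint at archive.org/wayback/available that returns the closest archived snapshot of a URL. A minimal sketch of querying it (the endpoint and response shape are from the Internet Archive's public API; the helper names are my own):

```python
import json
import urllib.parse
import urllib.request

AVAILABILITY_ENDPOINT = "https://archive.org/wayback/available"

def availability_url(page_url, timestamp=None):
    """Build a Wayback Availability API query for a page.

    timestamp (optional) is YYYYMMDDhhmmss; the API returns the
    snapshot closest to that moment.
    """
    params = {"url": page_url}
    if timestamp:
        params["timestamp"] = timestamp
    return AVAILABILITY_ENDPOINT + "?" + urllib.parse.urlencode(params)

def closest_snapshot(page_url, timestamp=None):
    """Return the URL of the closest archived snapshot, or None.

    The response JSON looks like:
    {"archived_snapshots": {"closest": {"available": true, "url": ...}}}
    with an empty "archived_snapshots" when nothing is archived.
    """
    with urllib.request.urlopen(availability_url(page_url, timestamp),
                                timeout=10) as resp:
        data = json.load(resp)
    snap = data.get("archived_snapshots", {}).get("closest")
    return snap["url"] if snap and snap.get("available") else None
```

Unlike scraping Google's cache pages, this is an endpoint the Internet Archive provides for exactly this purpose, though you should still check their usage guidelines before sending thousands of requests.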