My company is currently adding Google +1 links to our site.
We have the code working, but it appears the Google+ crawler cannot access the page content. When the share-link snippet is generated, it renders with a message saying the crawler could not view the content because it failed a check that distinguishes bots from human visitors.
We can whitelist the bot, but the system we use only accepts a User-Agent string and a URL. When that User-Agent is detected, a reverse lookup is run on the visitor's IP, and the result is compared against the IPs behind the URL we entered to confirm the request comes from the same set of IPs.
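For context, the check it runs amounts to forward-confirmed reverse DNS. A rough sketch of that idea in Python (the function and parameter names are just illustrative, not our actual system):

    import socket

    def ip_matches_host(visitor_ip, expected_domain):
        # Reverse lookup: resolve the visiting IP to a hostname.
        try:
            hostname, _, _ = socket.gethostbyaddr(visitor_ip)
        except socket.herror:
            return False
        # The hostname must belong to the expected domain (e.g. "google.com").
        if not hostname.rstrip(".").endswith(expected_domain):
            return False
        # Forward lookup: the hostname must resolve back to the original IP.
        try:
            _, _, forward_ips = socket.gethostbyname_ex(hostname)
        except socket.gaierror:
            return False
        return visitor_ip in forward_ips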
I know the Google+ crawler does not use a bot-style user agent such as Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html), but is there a user agent we can run the necessary whitelist test against?
Yes, it does. The +Snippet bot's user agent contains the following string:
Google (+https://developers.google.com/+/web/snippet/)
This is the full user agent it returned for me:
Mozilla/5.0 (Windows NT 6.1; rv:6.0) Gecko/20110814 Firefox/6.0 Google (+https://developers.google.com/+/web/snippet/)
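If you need to recognize the bot in your own code, a substring check against that marker is enough. A minimal sketch in Python (the function name is just illustrative):

    # The snippet crawler identifies itself by appending this marker to an
    # otherwise browser-like user agent, so test for the substring rather
    # than an exact match.
    SNIPPET_MARKER = "Google (+https://developers.google.com/+/web/snippet/)"

    def is_plus_snippet_bot(user_agent):
        return SNIPPET_MARKER in (user_agent or "")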