 

Google-Plus Crawler

Tags:

google-plus

Currently, my company is attempting to add Google Plus One links to our site.

We have the code working; however, it appears that the Google-Plus crawler is unable to access the page content. When the share-link snippet is created, it renders with a message stating that the crawler cannot view the content because it fails a test that differentiates bots from human visitors.

We can white-list the bot; however, the system we are using only accepts a User-Agent and a URL. When that User-Agent is detected, a reverse lookup is run on the requesting IP and the result is compared against the URL that was entered, to confirm the request comes from the same set of IPs.

I know that the Google Plus crawler does not use a bot-style user-agent like Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html), but is there a user-agent we can perform the necessary white-list test on?
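For context, here is roughly what that white-list test amounts to; a minimal sketch in Python using the standard socket module, assuming the crawler's IPs reverse-resolve to a Google-owned hostname (the domain suffixes below are placeholders, not confirmed for the +Snippet bot):

    import socket

    # Placeholder suffixes; the actual reverse-DNS domains for the +Snippet
    # bot are an assumption here, not documented in this thread.
    EXPECTED_SUFFIXES = (".google.com", ".googlebot.com")

    def verify_crawler_ip(ip):
        """Forward-confirmed reverse DNS: resolve the IP to a hostname,
        check the hostname's domain, then resolve the hostname forward
        again and confirm it maps back to the same IP."""
        try:
            hostname, _, _ = socket.gethostbyaddr(ip)            # reverse lookup
        except socket.herror:
            return False
        if not hostname.endswith(EXPECTED_SUFFIXES):
            return False
        try:
            forward_ips = socket.gethostbyname_ex(hostname)[2]   # forward lookup
        except socket.gaierror:
            return False
        return ip in forward_ips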

asked Dec 23 '13 by Dxcv

2 Answers

Yes, there is. The +Snippet bot's user agent contains the following string:

Google (+https://developers.google.com/+/web/snippet/)
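Based on that, a minimal sketch of a white-list check keyed on the User-Agent header; the function name is illustrative, and the substring is taken from the string quoted above:

    SNIPPET_BOT_MARKER = "developers.google.com/+/web/snippet"

    def is_plus_snippet_bot(user_agent):
        """True if the request's User-Agent contains the +Snippet bot
        marker quoted in this answer (simple substring match)."""
        return bool(user_agent) and SNIPPET_BOT_MARKER in user_agent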
answered by abraham


This is the user agent that was reported for me:

Mozilla/5.0 (Windows NT 6.1; rv:6.0) Gecko/20110814 Firefox/6.0 Google (+https://developers.google.com/+/web/snippet/)
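For a quick sanity check, the substring test sketched in the first answer matches this full string as well (same assumptions):

    ua = ("Mozilla/5.0 (Windows NT 6.1; rv:6.0) Gecko/20110814 Firefox/6.0 "
          "Google (+https://developers.google.com/+/web/snippet/)")
    print("developers.google.com/+/web/snippet" in ua)  # True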

answered by Devin Dixon