 

Does Prerender caching risk a Google cloaking penalty?

Following some trouble getting the Google Crawler to parse our AngularJS site, we're using Prerender to serve a crawler-friendly version of our pages.

This has worked well, except that Webmaster Tools indicates our site speed has worsened considerably due to Prerender's latency. We're concerned this will hurt our ranking.
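For reference, our setup looks roughly like the sketch below (the middleware, PRERENDER_URL and example.com are illustrative placeholders rather than our exact code; Node 18+ is assumed for the global fetch): every escaped-fragment request is rendered by Prerender on the fly, which is where the latency comes from.

    import express, { Request, Response, NextFunction } from "express";

    const PRERENDER_URL = "http://localhost:3000"; // placeholder Prerender service address

    const app = express();

    app.use(async (req: Request, res: Response, next: NextFunction) => {
      // Normal visitors get the JavaScript-rendered AngularJS app.
      if (req.query["_escaped_fragment_"] === undefined) return next();

      // Crawlers get a prerendered snapshot; every crawl triggers a fresh
      // headless render, so each response waits on Prerender's latency.
      const pageUrl = `http://example.com${req.path}`; // placeholder domain
      const rendered = await fetch(`${PRERENDER_URL}/${pageUrl}`);
      res.send(await rendered.text());
    });

    app.listen(8080);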

So two questions:

  1. Does Google use the Prerender pages in measuring site speed, or the (true) JavaScript-enabled version of our site? We suspect it's the former.

  2. One possible solution is to cache the prerendered pages. However, these cached pages may not perfectly match what the user sees, because of the delay between the page being put into cache and being returned to the crawler. For example, we may add products to a page, and our titles/meta tags reflect the number of products available at any one time. Are these small differences in title, meta description and page content enough to risk a cloaking penalty? If so, what is the alternative to caching?

Many thanks for any help.

asked Mar 19 '15 by user3700505

1 Answer

  1. When it comes to crawl speed, Google uses the Prerender page response time. This is why it's important to cache your pages, so that the Prerender server doesn't have to load the page in the browser each time. Returning cached pages will make Googlebot crawl your site very fast (see the sketch after this list).

  2. As long as you are using the ?_escaped_fragment_= protocol and not matching on the Googlebot user agent, you won't be penalized for cloaking even if the pages differ in the ways you mentioned. Just avoid matching on the user agent, don't pad your Prerender pages with keywords, and you'll be fine.
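A rough sketch of both points, assuming an Express front end in front of a Prerender service (the service URL, example.com domain, in-memory Map and TTL below are illustrative placeholders, not the official Prerender middleware API; Node 18+ is assumed for the global fetch):

    import express, { Request, Response, NextFunction } from "express";

    const PRERENDER_URL = "http://localhost:3000"; // placeholder Prerender service address
    const CACHE_TTL_MS = 24 * 60 * 60 * 1000;      // e.g. re-render each page daily

    interface CacheEntry { html: string; storedAt: number; }
    const cache = new Map<string, CacheEntry>();

    const app = express();

    app.use(async (req: Request, res: Response, next: NextFunction) => {
      // Point 2: decide based on the escaped-fragment protocol,
      // never on the Googlebot user agent string.
      if (req.query["_escaped_fragment_"] === undefined) return next();

      // Point 1: serve a cached snapshot when we have a fresh one,
      // so Googlebot never waits on a headless render.
      const pageUrl = `http://example.com${req.path}`; // placeholder domain
      const hit = cache.get(pageUrl);
      if (hit && Date.now() - hit.storedAt < CACHE_TTL_MS) {
        return res.send(hit.html);
      }

      // Cache miss or stale entry: render once via Prerender, store, serve.
      const rendered = await fetch(`${PRERENDER_URL}/${pageUrl}`);
      const html = await rendered.text();
      cache.set(pageUrl, { html, storedAt: Date.now() });
      res.send(html);
    });

    app.listen(8080);

Shortening the TTL, or re-rendering a page whenever its product list changes, keeps the cached title and meta tags close to what users see, so you get the speed benefit without the snapshot drifting far from the live page.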

answered Oct 25 '22 by Prerender.io