Following some trouble getting the Google Crawler to parse our AngularJS site, we're using Prerender to serve a crawler-friendly version of our pages.
This has worked well, except that Webmaster Tools indicates our site speed has worsened considerably due to Prerender's latency. We're concerned this will hurt our ranking.
So two questions:
Does Google measure site speed using the Prerender pages, or the (true) JavaScript-enabled version of our site? We suspect it's the former.
One possible solution is to cache the Prerendered pages. However, these cached pages may not perfectly match what the user sees, because of the delay between a page being put into the cache and being returned to the crawler. For example, we may add additional products to a page, and the title/meta tags reflect the number of products available at any one time. Are these small differences in title, meta description and page content enough to risk a cloaking penalty? If so, what is the alternative to caching?
Many thanks for any help.
When it comes to crawl speed, Google uses the Prerender page response time. This is why it's important to cache your pages so that the Prerender server doesn't have to load the page in the browser each time. Returning cached pages will make Googlebot crawl your site very fast.
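For illustration, here is a minimal caching sketch of my own (not part of Prerender's API): an in-memory map keyed by URL with a TTL, fronting the open-source Prerender server. `PRERENDER_URL`, `CACHE_TTL_MS` and `www.example.com` are placeholders; in production you would more likely use Redis or prerender-node's own cache hooks.

```js
// Hypothetical caching layer for prerendered HTML. All names here are
// illustrative assumptions, not part of Prerender itself.
const http = require('http');

const PRERENDER_URL = 'http://localhost:3000';   // where the Prerender server listens
const CACHE_TTL_MS = 60 * 60 * 1000;             // refresh snapshots hourly
const cache = new Map();                         // url -> { html, cachedAt }

function servePrerendered(req, res) {
  const key = req.originalUrl;
  const hit = cache.get(key);
  if (hit && Date.now() - hit.cachedAt < CACHE_TTL_MS) {
    return res.send(hit.html);                   // fast path: no Prerender round trip
  }
  // Cache miss: ask the Prerender server to render the page, then store the result.
  const target = PRERENDER_URL + '/http://www.example.com' + req.path;
  http.get(target, (prerenderRes) => {
    let html = '';
    prerenderRes.on('data', (chunk) => { html += chunk; });
    prerenderRes.on('end', () => {
      cache.set(key, { html, cachedAt: Date.now() });
      res.send(html);
    });
  }).on('error', () => res.sendStatus(503));
}
```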
As long as you are triggering prerendering via the ?_escaped_fragment_= protocol and not matching on the Googlebot user agent, you won't be penalized for cloaking, even if the pages differ in the small ways you describe. Just avoid serving different content based on the Googlebot user agent and don't pad your Prerender pages with extra keywords, and you'll be fine.
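Continuing the sketch above, this is roughly what that wiring looks like in Express: the decision to serve the snapshot keys off the ?_escaped_fragment_= query parameter only, never the user-agent string. The port and static directory are assumptions for the example.

```js
// Hypothetical wiring: serve the prerendered snapshot only when the crawler
// asks for it via ?_escaped_fragment_=, never based on the user agent.
const express = require('express');
const app = express();

app.use((req, res, next) => {
  if (req.query._escaped_fragment_ !== undefined) {
    return servePrerendered(req, res);   // crawler following the AJAX-crawling scheme
  }
  next();                                // everyone else gets the normal AngularJS app
});

app.use(express.static('public'));       // your regular Angular build
app.listen(8080);
```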