 

Are AJAX sites crawlable by search engines?

I had always assumed that AJAX-driven content was invisible to search engines.

(i.e. content inserted into the DOM via XMLHttpRequest)
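In code terms, that pattern looks roughly like the following sketch (the `/api/report` endpoint and the `main-content` element id are made up for illustration; this is not trustedsource.org's actual code):

```typescript
// Minimal sketch of AJAX-driven content: the initial HTML ships an empty
// container, and the real content is fetched and inserted client-side,
// so it only exists in the DOM after JavaScript runs.
const xhr = new XMLHttpRequest();
xhr.open("GET", "/api/report?host=terra.cl"); // hypothetical endpoint
xhr.onload = () => {
  const container = document.getElementById("main-content"); // hypothetical id
  if (container && xhr.status === 200) {
    container.innerHTML = xhr.responseText;
  }
};
xhr.send();
```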

For example, on this site, the main content is loaded by the browser via an AJAX request:

http://www.trustedsource.org/query/terra.cl

...if you view this page with JavaScript disabled, the main content area is blank.

However, Google cache shows the full content after the AJAX load:

http://74.125.155.132/search?q=cache:JqcT6EVDHBoJ:www.trustedsource.org/query/terra.cl+http://www.trustedsource.org/query/terra.cl&cd=1&hl=en&ct=clnk&gl=us

So, apparently search engines do index content loaded by AJAX.

Questions:

  • Is this a new feature in search engines? Most postings on the web indicate that you have to publish duplicate static HTML content for search engines to find it.
  • Are there any tricks to get AJAX-driven content crawled by search engines (besides creating duplicate static HTML content)?
  • Will the AJAX-driven content be indexed if it is loaded from a separate subdomain? How about a separate domain?
asked Jul 23 '09 by frankadelic



2 Answers

Following this guide from Google, AJAX sites may be made crawlable:

http://code.google.com/intl/sv-SE/web/ajaxcrawling/docs/getting-started.html
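In outline, that (now-deprecated) scheme has the site expose its AJAX states as "#!" URLs; the crawler then requests the same state with an `_escaped_fragment_` query parameter, and the server returns a pre-rendered HTML snapshot. A rough server-side sketch, assuming a Node/Express app and a hypothetical `renderSnapshot` helper:

```typescript
import express from "express";

const app = express();

// Googlebot rewrites "example.com/#!/query/terra.cl" to
// "example.com/?_escaped_fragment_=/query/terra.cl" before requesting it.
app.use((req, res, next) => {
  const fragment = req.query._escaped_fragment_;
  if (typeof fragment === "string") {
    // renderSnapshot is a hypothetical helper that returns the fully
    // rendered HTML for the given AJAX state (e.g. via a headless browser).
    res.send(renderSnapshot(fragment));
  } else {
    next(); // normal visitors get the regular AJAX-driven page
  }
});

// Placeholder implementation, for illustration only.
function renderSnapshot(fragment: string): string {
  return `<html><body>Snapshot for state ${fragment}</body></html>`;
}

app.listen(3000);
```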

answered Sep 19 '22 by frankadelic


Search engines could run the JavaScript needed to index Ajax content, but it would be difficult and computationally expensive — I'm not aware of any that actually do.

A well written site will, if it uses Ajax, use it according to the principles of progressive enhancement. Any key functionality will still be available without needing to run the JavaScript.
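As a sketch of that principle (not code from either site discussed here; the `data-ajax` attribute and `main-content` id are assumed names), the markup stays an ordinary crawlable link, and JavaScript, when it is available, upgrades the click into an in-page load of the same URL:

```typescript
// Progressive enhancement sketch: each <a data-ajax href="..."> works (and is
// crawlable) without JavaScript; when JS runs, we intercept the click and load
// the same URL into the page instead of navigating away.
document.querySelectorAll<HTMLAnchorElement>("a[data-ajax]").forEach((link) => {
  link.addEventListener("click", async (event) => {
    event.preventDefault();
    const response = await fetch(link.href); // same URL the crawler follows
    const html = await response.text();
    document.getElementById("main-content")!.innerHTML = html;
  });
});
```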

On the other hand, sites which use JavaScript to reinvent frames (without progressive enhancement) will suffer from all the usual problems of frames, except that they trade orphan pages for invisibility to search engines.

answered Sep 18 '22 by Quentin