 

Can search engines index JavaScript generated web pages?

Can search engines such as Google index JavaScript-generated web pages? When you right-click and select "view source" on a page that is generated by JavaScript (e.g. using GWT), you do not see the dynamically generated HTML. I suppose that if a search engine also cannot see the generated HTML, then there is not much to index, right?

Roy asked May 05 '09

People also ask

Can Google index JavaScript generated content?

We ran a series of tests that verified Google is able to execute and index JavaScript with a multitude of implementations. We also confirmed Google is able to render the entire page and read the DOM, thereby indexing dynamically generated content.

How does a search engine index a page?

The index is where your discovered pages are stored. After a crawler finds a page, the search engine renders it just like a browser would. In the process of doing so, the search engine analyzes that page's contents. All of that information is stored in its index.

Is JavaScript good for SEO?

From an SEO standpoint, it is acceptable not to serve JavaScript to search engine crawlers, provided the HTML text content and formatting you return are essentially the same as what human visitors to your site see.


1 Answer

Your suspicion is correct - JS-generated content cannot be relied on to be visible to search bots. It also can't be seen by anyone with JS turned off - and, the last time I added some tests to a site I was working on (a large, mainstream-audience site with hundreds of thousands of unique visitors per month), approximately 10% of users were not running JavaScript in any form. That includes search bots, PC browsers with JS disabled, many mobiles, blind people using screen readers... etc etc.

This is why content generated via JS (with no fallback option) is a Really Bad Idea.

Back to basics. First, create your site using bare-bones (X)HTML, on REST-like principles (at least to the extent of requiring POST requests for state changes). Use simple semantic markup, and forget about CSS and JavaScript for now.

Step one is to get that right, and have your entire site (or as much of it as makes sense) working nicely this way for search bots and Lynx-like user agents.
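As a rough sketch of that baseline (the /comments endpoint is just an illustration, not anything from the question), a page at this stage is nothing but semantic markup plus a plain POST form:

    <!DOCTYPE html>
    <html lang="en">
      <head>
        <title>Example article</title>
      </head>
      <body>
        <h1>Example article</h1>
        <p>All content is ordinary HTML, so search bots and text-only browsers can read it.</p>
        <!-- A state change (posting a comment) goes through a normal POST form -->
        <form action="/comments" method="post">
          <label for="comment">Comment</label>
          <textarea id="comment" name="comment"></textarea>
          <button type="submit">Post comment</button>
        </form>
      </body>
    </html>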

Then add a visual layer: CSS/graphics/media for visual polish, but don't significantly change your original (X)HTML markup; allow the original text-only site to stay intact and functioning. Keep your markup clean!
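A sketch of that visual layer: the styling lives entirely in the head (inline here for brevity; an external stylesheet works the same way), and the body markup from the previous step is left exactly as it was, so the text-only version keeps working:

    <!-- Added inside <head>; the body markup stays exactly as before -->
    <style>
      /* Style the existing semantic elements; no extra wrapper markup needed */
      body { font-family: Georgia, serif; max-width: 40em; margin: 0 auto; }
      form { border: 1px solid #ccc; padding: 1em; }
    </style>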

Third, add a behavioural layer: JavaScript (Ajax). Offer things that make the experience faster, smoother, and nicer for users/browsers with Ajax-capable JS... but only for those users. Users without JavaScript are still welcome; so are search bots, the visually impaired, many mobiles, etc.
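A sketch of that behavioural layer (again using the made-up /comments form from above): a small script, added before the closing </body> tag, takes over form submission only when JavaScript is actually running; otherwise the plain POST form keeps doing the work:

    <script>
      // Behavioural layer: enhance the existing form only when JS actually runs.
      // With JS disabled, the form still submits as a normal POST request.
      var form = document.querySelector('form[action="/comments"]');
      if (form && window.fetch) {
        form.addEventListener('submit', function (event) {
          event.preventDefault(); // take over submission only in capable browsers
          fetch(form.action, { method: 'POST', body: new FormData(form) })
            .then(function () {
              form.insertAdjacentHTML('afterend', '<p>Comment posted.</p>');
            })
            .catch(function () {
              form.submit(); // on failure, fall back to the plain POST
            });
        });
      }
    </script>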

This is called progressive enhancement in web design circles. Do it this way and your site works, in some reasonable form, for everyone.

mattandrews answered Oct 16 '22