Do search engines process JavaScript?

According to this page, it would seem they don't, in the sense that they don't actually run it, but that page is two years old (judging from the copyright info).

The reason I'm asking is that we use JavaScript to replace text on our site with other, more typographically sound content. We're worried that this may affect the crawlability/SEO of our sites, since what we're replacing is generally headers; i.e. <h1>, <h2>, etc.

Will search engine bots see our original code, or will they run the JavaScript and see the replaced text?
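For illustration, here's a minimal sketch of the kind of replacement we do (simplified; the class name and event wiring are stand-ins, not our actual code):

    // Hypothetical sketch of the header replacement described above.
    // The original text remains in the markup until this script runs.
    document.addEventListener('DOMContentLoaded', function () {
      var headings = document.querySelectorAll('h1, h2');
      for (var i = 0; i < headings.length; i++) {
        var span = document.createElement('span');
        span.className = 'fancy-type'; // hypothetical class styled with an embedded font
        span.textContent = headings[i].textContent;
        headings[i].textContent = '';  // clear the original text
        headings[i].appendChild(span); // insert the restyled copy
      }
    });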

asked Jan 15 '10 by Matthew Scharley

3 Answers

Search engines don't process JavaScript as such.

There is some evidence that Google may have started processing inline script content in some cases, in order to catch content that is written into the page parse queue using document.write. However, DOM methods such as those you might use for font replacement are certainly not affected, and no onload code is invoked.
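To make the distinction concrete (a sketch, not from the original answer): content emitted via document.write while the page is being parsed joins the parse queue, whereas DOM manipulation after load does not:

    <script>
      // Runs during parsing: the written markup enters the parse queue,
      // which is the kind of content Google may have started picking up.
      document.write('<h1>Written during parsing</h1>');
    </script>

    <script>
      // Runs only after load: crawlers of the time did not execute this,
      // so they saw only the original markup.
      window.onload = function () {
        document.getElementsByTagName('h1')[0].textContent = 'Replaced after load';
      };
    </script>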

answered by bobince

Google now officially processes JavaScript.

In order to solve this problem, we decided to try to understand pages by executing JavaScript. It’s hard to do that at the scale of the current web, but we decided that it’s worth it. We have been gradually improving how we do this for some time. In the past few months, our indexing system has been rendering a substantial number of web pages more like an average user’s browser with JavaScript turned on.

Sometimes things don't go perfectly during rendering, which may negatively impact search results for your site. Here are a few potential issues, and, where possible, how you can help prevent them from occurring:
  • If resources like JavaScript or CSS in separate files are blocked (say, with robots.txt) so that Googlebot can’t retrieve them, our indexing systems won’t be able to see your site like an average user. We recommend allowing Googlebot to retrieve JavaScript and CSS so that your content can be indexed better. This is especially important for mobile websites, where external resources like CSS and JavaScript help our algorithms understand that the pages are optimized for mobile. If your web server is unable to handle the volume of crawl requests for resources, it may have a negative impact on our capability to render your pages. If you’d like to ensure that your pages can be rendered by Google, make sure your servers are able to handle crawl requests for resources.
  • It's always a good idea to have your site degrade gracefully. This will help users enjoy your content even if their browser doesn't have compatible JavaScript implementations. It will also help visitors with JavaScript disabled or off, as well as search engines that can't execute JavaScript yet.
  • Sometimes the JavaScript may be too complex or arcane for us to execute, in which case we can’t render the page fully and accurately.
  • Some JavaScript removes content from the page rather than adding, which prevents us from indexing the content.
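One way to follow the graceful-degradation advice in the list above is to keep the real text in the HTML and treat the script purely as an enhancement; a minimal sketch (the id and class names are assumptions):

    <!-- The heading text lives in the markup, so it stays indexable
         and readable even if JavaScript is disabled or fails. -->
    <h1 id="site-title">Plain but crawlable heading</h1>

    <script>
      // Enhancement only: wrap the existing text for custom styling.
      // If this never runs, nothing is lost.
      var h1 = document.getElementById('site-title');
      var span = document.createElement('span');
      span.className = 'fancy-type'; // hypothetical web-font class
      span.textContent = h1.textContent;
      h1.textContent = '';
      h1.appendChild(span);
    </script>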
answered by John Conde


Generally, no. Google has mentioned that they are working on a system for indexing AJAX content, but I don't think any of the major search engines index dynamic content as a rule. See this page for Google's take on it: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=81766
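The system referred to here is likely what became Google's AJAX crawling scheme (since deprecated): a URL such as example.com/#!page=about was fetched by the crawler as example.com/?_escaped_fragment_=page=about, and the server was expected to answer with a static HTML snapshot. A minimal sketch, assuming Node with Express and a hypothetical renderSnapshot helper:

    // Sketch of the old AJAX crawling scheme (deprecated by Google in 2015).
    // renderSnapshot is a hypothetical helper returning static HTML
    // for a given application state.
    const express = require('express');
    const app = express();

    app.get('/', (req, res) => {
      const fragment = req.query._escaped_fragment_;
      if (fragment !== undefined) {
        // Crawler request: /#!page=about arrives as /?_escaped_fragment_=page=about
        res.send(renderSnapshot(fragment));
      } else {
        // Normal browser request: serve the JavaScript-driven page.
        res.sendFile(__dirname + '/index.html');
      }
    });

    app.listen(3000);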

answered by Glenn Slaven