Can anyone give me a good reason not to use the Hijax (progressive enhancement) method in addition to the hashbang method Google proposes? As far as I can see, the Hijax method is still the better one:

The only counterargument I've found so far is that when a user clicks a link in a search engine and has JavaScript enabled, you'll need to redirect them to the JavaScript-enabled version of the URL (the one with the # fragment); a sketch of such a redirect follows below.

With Google's hashbang approach, it's difficult to supply a non-JavaScript version, and Bing and Yahoo can't crawl your website.
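To make that redirect concrete, here is a minimal sketch (my own illustration, not an established snippet), assuming the non-JS version lives at `?page=<name>` and the JS version at `#!/<name>`:

```html
<script>
// Sketch of the redirect described above. It only runs when JavaScript
// is enabled, so non-JS visitors stay on the server-rendered page.
// The "page" parameter name and "#!/" format are assumptions.
(function () {
  var match = window.location.search.match(/[?&]page=([^&]+)/);
  if (match && !window.location.hash) {
    // Swap the query-string URL for its hash-fragment equivalent.
    window.location.replace('/#!/' + decodeURIComponent(match[1]));
  }
})();
</script>
```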
Kind regards,
Daan
The "value allocation" answer isn't quite correct.
The question is about surfacing content for search engines, and hashbang is Google's answer for that. That said, a user who doesn't have JS enabled (or another search engine or social-network scraper that doesn't support hashbang) will never see your content. Google can see it because they're the ones checking for hashbang.
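For context, Google's scheme works by translating the hashbang URL into a query parameter: the crawler fetches http://example.com/#!/about as http://example.com/?_escaped_fragment_=/about and expects the server to answer with an HTML snapshot. A minimal sketch of the server side, assuming an Express-style app (the renderSnapshot helper is hypothetical):

```javascript
var express = require('express');
var app = express();

// Hypothetical helper: produce a static HTML snapshot for a fragment.
function renderSnapshot(fragment) {
  return '<html><body><h1>Snapshot for ' + fragment + '</h1></body></html>';
}

app.get('/', function (req, res) {
  var fragment = req.query._escaped_fragment_;
  if (fragment !== undefined) {
    res.send(renderSnapshot(fragment));               // crawler gets static HTML
  } else {
    res.sendFile('index.html', { root: __dirname });  // users get the JS app shell
  }
});

app.listen(3000);
```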
Hijax, on the other hand, always allows non-JS users and bots to see your content, because it does not rely on the hash/hashbang at all: Hijax relies on standard query-string parameters. This means your application must have back-end logic to render your content for non-JS user agents. In the end, with Hijax, JS-enabled users get the asynchronous experience and non-JS users get full page loads; see the sketch below.
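A minimal Hijax sketch, with assumed names throughout: the ?page=about URL, the .hijax class, and the #main container are all illustrative, and the back end is assumed to return either a full page or just the content fragment depending on the X-Requested-With header:

```html
<!-- The link works without JavaScript: the back end renders the
     content as a full page load. -->
<a href="?page=about" class="hijax">About</a>
<div id="main"></div>

<script>
// With JavaScript enabled, hijack the click and load the same URL
// asynchronously instead of doing a full page load.
document.addEventListener('click', function (event) {
  var link = event.target.closest('a.hijax');
  if (!link) return;
  event.preventDefault();
  fetch(link.getAttribute('href'), {
    headers: { 'X-Requested-With': 'XMLHttpRequest' } // lets the server send a partial
  })
    .then(function (response) { return response.text(); })
    .then(function (html) {
      document.getElementById('main').innerHTML = html;
    });
});
</script>
```

With JS disabled, the click falls through to a normal navigation and the back end renders the full page; with JS enabled, the same URL is fetched asynchronously. That is why both non-JS users and non-hashbang crawlers always reach the content.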
Google continues to recommend Hijax. Hashbang is their offering for non-Hijax apps already out in the wild, and/or JS apps that don't have a back end.
http://googlewebmastercentral.blogspot.com/2007/11/spiders-view-of-web-20.html (see progressive enhancement section)