We use Backbone heavily for rendering our pages. All of the data is passed as JSON from the server, and the HTML is created on the client with Backbone and Mustache. This poses a big problem for SEO. One way I was planning to get around this is to detect whether the request comes from a bot and, if so, use something like HtmlUnit to render the page on the server and return that instead. I'd love some alternate ideas, and I'd also like to know if there's a flaw in what I'm planning to do.
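To make the plan concrete, here is a minimal sketch of the bot-detection step, written as Express middleware purely for illustration (the question's actual renderer would be HtmlUnit or similar on the server); the user-agent regex and the `snapshots/` directory are assumptions, not part of the original setup:

```javascript
const express = require("express");
const fs = require("fs");
const path = require("path");

const app = express();

// Hypothetical crawler user-agent list; extend as needed.
const BOT_UA = /googlebot|bingbot|yandexbot|baiduspider|duckduckbot/i;

app.use(function (req, res, next) {
    if (BOT_UA.test(req.get("User-Agent") || "")) {
        // Crawlers get a pre-rendered HTML snapshot of the requested page
        // (produced ahead of time by HtmlUnit or a similar headless renderer).
        const snapshot = path.join(__dirname, "snapshots", req.path + ".html");
        fs.readFile(snapshot, "utf8", function (err, html) {
            if (err) return next(); // no snapshot: fall back to the normal app shell
            res.send(html);
        });
    } else {
        // Real users get the Backbone/Mustache client-rendered app.
        next();
    }
});

app.use(express.static(path.join(__dirname, "public")));
app.listen(3000);
```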
Build your site using Progressive Enhancement and Unobtrusive JavaScript.
When you do significant Ajax work, use the History API.
Then you have real URLs for everything and Google won't be a problem.
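With Backbone, that mostly comes down to turning on pushState routing so routes map to real paths instead of #fragments. A minimal sketch, assuming a Backbone app is already loaded; the router and route names below are made-up placeholders:

```javascript
var AppRouter = Backbone.Router.extend({
    routes: {
        "": "home",
        "products/:id": "showProduct"   // served as /products/42, not #products/42
    },
    home: function () { /* render the home view */ },
    showProduct: function (id) { /* fetch the model, render the detail view */ }
});

var router = new AppRouter();

// pushState: true makes Backbone use the HTML5 History API, so every state
// of the app has a real URL the server can also respond to directly
// (for crawlers, bookmarks, and first page loads).
Backbone.history.start({ pushState: true, root: "/" });

// Navigate with real URLs from your views and links, e.g.:
// router.navigate("products/42", { trigger: true });
```

The server still has to answer those URLs with at least a usable page (that's the progressive-enhancement part), so a deep link works even before the JavaScript boots.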