I have a website that requires JavaScript to work. There is a <noscript> tag that contains a meta refresh to redirect the user to a page alerting them that JavaScript is disabled.
I am wondering, is this a bad thing for search engine crawlers?
I send myself an e-mail whenever a visitor doesn't have JS, so I can analyze whether it's necessary to rebuild the website for those people. It turns out virtually 100% of visitors have JS enabled, and the only ones that don't are search engine crawlers. I guess Google, Yahoo, etc. don't take the meta refresh seriously when it's inside a <noscript> tag?
Should I do something to detect whether they are bots and skip the meta redirect for them?
Thanks,
Joe
Instead of forcibly redirecting the user/bot, why not just display text at the top of the page telling visitors to enable JavaScript in order to use the site?
This lets bots still read the page and follow the non-JavaScript links. It ends the redirect problem and removes the need to serve bots a different page, which would otherwise mean maintaining multiple pages.
You may also want to take a look at Google Webmaster Tools to see exactly what Google is currently reading of your site, and improve based on that.
Example: disabling JavaScript on SO shows a red banner at the top that simply states "Stack Overflow works best with JavaScript enabled". You could make that banner link to a page with more info if you feel it's not enough.
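A minimal sketch of such a banner (the class name, styling, and the /no-js-info.html link target are just illustrative):

```html
<!-- Rendered only when JavaScript is disabled; crawlers still index
     the rest of the page and follow its regular links -->
<noscript>
  <div class="js-warning"
       style="background: #c00; color: #fff; padding: 8px; text-align: center;">
    This site works best with JavaScript enabled.
    <a href="/no-js-info.html" style="color: #fff;">Learn more</a>
  </div>
</noscript>
```

Because the banner is plain markup inside <noscript>, no redirect ever fires, so bots see the normal page content rather than a warning page.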