Say I have a site at http://example.com. I would really like to allow bots to see the home page, but any other page needs to be blocked, as it is pointless to spider. In other words, http://example.com and http://example.com/ should be allowed, but http://example.com/anything and http://example.com/someendpoint.aspx should be blocked.
Furthermore, it would be great if I could allow certain query strings to pass through to the home page: http://example.com?okparam=true, but not http://example.com?anythingbutokparam=true.
So after some research, here is what I found: a solution acceptable to the major search providers, Google, Yahoo & MSN (I could only find a validator here):
    User-Agent: *
    Disallow: /*
    Allow: /?okparam=
    Allow: /$
The trick is using the $ to mark the end of the URL.
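
If you want to sanity-check which URLs those rules allow, here is a minimal Python sketch of the matching. Python's built-in urllib.robotparser does not understand * or $, so this hand-rolls the longest-match precedence Google documents (the longest matching pattern wins; on a tie, Allow beats Disallow). It is an illustration under those assumptions, not any engine's actual implementation:

    import re

    # The rules from the robots.txt above.
    RULES = [
        ("Disallow", "/*"),
        ("Allow", "/?okparam="),
        ("Allow", "/$"),
    ]

    def pattern_to_regex(pattern):
        # '*' matches any run of characters; a trailing '$' anchors
        # the end of the URL. Everything else is literal.
        anchored = pattern.endswith("$")
        body = pattern[:-1] if anchored else pattern
        parts = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
        return re.compile("^" + parts + ("$" if anchored else ""))

    def is_allowed(path):
        hits = [(len(pat), kind) for kind, pat in RULES
                if pattern_to_regex(pat).match(path)]
        if not hits:
            return True  # nothing matched: crawling is allowed by default
        # Longest pattern wins; on equal length, Allow beats Disallow.
        hits.sort(key=lambda h: (h[0], h[1] == "Allow"), reverse=True)
        return hits[0][1] == "Allow"

    for path in ["/", "/?okparam=true",
                 "/?anythingbutokparam=true", "/someendpoint.aspx"]:
        print(path, "->", "allowed" if is_allowed(path) else "blocked")

Running it prints allowed for / and /?okparam=true and blocked for the other two, which is exactly the behavior the rules are meant to produce.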
Google's Webmaster Tools reports that disallow always takes precedence over allow, so there's no easy way of doing this in a robots.txt file.
You could accomplish this by putting a noindex,nofollow META tag in the HTML of every page but the home page.
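
For reference, a tag of that form goes in the <head> of each page you want kept out of the index; the content value below is the standard one, but double-check against your target engines' documentation:

    <meta name="robots" content="noindex, nofollow">

Note that, unlike a robots.txt rule, this only works if the crawler is allowed to fetch the page in the first place, since it has to read the HTML to see the tag.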