I have a site that uses wildcard subdomains, so that when somebody signs up they get their own subdomain.
I do not want Google (or other search engines) to crawl and index any of the subdomains (except for www.).
Is there a way to do this with robots.txt?
Regards
I'm guessing no, at least not directly with one global robots.txt file. See: http://www.seomoz.org/q/block-an-entire-subdomain-with-robots-txt
Somewhere on that page andykuiper wrote:
you can block an entire subdomain via robots.txt, however you'll need to create a robots.txt file and place it in the root of the subdomain, then add the code to direct the bots to stay away from the entire subdomain's content.
User-agent: *
Disallow: /
Make a script that creates/copies that robots.txt file into the root of each newly created subdomain, and everything should work as intended.
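A minimal sketch of such a script, assuming each subdomain's document root lives under something like /var/www/subdomains/<name>/ (that path is only an example; adjust it to your hosting layout). The www docroot would simply keep a permissive robots.txt, or none at all:

#!/usr/bin/env python3
"""Drop a disallow-all robots.txt into a newly created subdomain's docroot.

Assumes subdomain docroots live under /var/www/subdomains/<name>/ -- this
path is a placeholder; change it to match your server setup.
"""
import sys
from pathlib import Path

# Tells all crawlers to stay out of the entire subdomain.
DISALLOW_ALL = "User-agent: *\nDisallow: /\n"

def write_robots(subdomain: str, base_dir: str = "/var/www/subdomains") -> Path:
    docroot = Path(base_dir) / subdomain
    docroot.mkdir(parents=True, exist_ok=True)  # create the docroot if needed
    robots = docroot / "robots.txt"
    robots.write_text(DISALLOW_ALL)
    return robots

if __name__ == "__main__":
    # e.g. called from the signup code: python add_robots.py newuser
    print(write_robots(sys.argv[1]))

Hooking this into the signup flow (or whatever provisions the new subdomain) means every subdomain starts life with the disallow-all file already in place.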