 

Stop Google indexing subdomains

I have a site that uses wildcard subdomains, so when somebody signs up they get their own subdomain.

I do not want Google (or other search engines) to crawl and index any of the subdomains (except for www.).

Is there a way to do this with robots.txt?

Regards

asked Oct 04 '22 by pjknight

1 Answer

I'm guessing no, at least not directly with one global robots.txt file. See: http://www.seomoz.org/q/block-an-entire-subdomain-with-robots-txt

Somewhere on that page andykuiper wrote:

you can block an entire subdomain via robots.txt, however you'll need to create a robots.txt file and place it in the root of the subdomain, then add the code to direct the bots to stay away from the entire subdomain's content.

User-agent: *
Disallow: /

See also:

  • http://productforums.google.com/forum/#!topic/webmasters/f5AfW1otAUo
  • Disallow or Noindex on Subdomain with robots.txt
  • Subdomain disallow search bots via robots.txt

Make a script that creates or copies this robots.txt file into the root of each newly created subdomain, and everything should work as intended (a rough sketch follows below).
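
As a minimal sketch, assuming each subdomain is served from its own document root under /var/www/subdomains/<name> (a hypothetical layout, not something stated in the answer), the provisioning script could drop the blocking robots.txt in place like this:

    #!/usr/bin/env python3
    # Sketch: write a blocking robots.txt into a newly created subdomain's
    # document root. The directory layout is an assumption, not from the answer.
    import sys
    from pathlib import Path

    # Assumed layout: each subdomain is served from /var/www/subdomains/<name>/
    SUBDOMAIN_ROOT = Path("/var/www/subdomains")

    # Tells all crawlers to stay away from the entire subdomain.
    BLOCKING_ROBOTS = "User-agent: *\nDisallow: /\n"

    def block_subdomain(name: str) -> None:
        docroot = SUBDOMAIN_ROOT / name
        docroot.mkdir(parents=True, exist_ok=True)
        (docroot / "robots.txt").write_text(BLOCKING_ROBOTS)

    if __name__ == "__main__":
        # e.g. ./block_subdomain.py customer1
        block_subdomain(sys.argv[1])

Hook something like this into whatever code already creates the subdomain on signup; the www. subdomain simply keeps its normal robots.txt and stays indexable.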

answered Oct 10 '22 by ZZ-bb