 

Can I prevent search engines from indexing an entire directory on my website?

Tags:

staging

I have a staging site which I use to draft new features, changes, and content for my actual website.

I don't want this site to get indexed, but I'm hoping for a solution a little easier than adding the tag below to every page:

<meta name="robots" content="noindex, nofollow">

Can I do this in a way similar to how I added a password to the domain using a .htaccess file?

asked Jan 29 '12 by Marty


2 Answers

The robots.txt standard is meant for exactly this. For example:

User-agent: *
Disallow: /protected-directory/

Search engines will obey this, but of course the content is still publicly accessible (and arguably easier to discover, since the directory is now listed in robots.txt), so password protection via .htaccess is an option too.
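For reference, a minimal sketch of that .htaccess password protection (assuming Apache with mod_auth_basic; the realm name and the AuthUserFile path are placeholders, and the .htpasswd file would be created with the htpasswd utility):

# Require a valid username/password for everything under this directory
AuthType Basic
AuthName "Staging"
AuthUserFile /var/www/.htpasswd
Require valid-user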

answered Sep 28 '22 by wutz


What you want is a robots.txt file.

The file should be in your server root, and the content should be something like:

User-agent: *
Disallow: /mybetasite/

This politely asks search indexing services not to index the pages under that directory, and all well-behaved search engines will respect it.
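If editing every page is the concern from the question, a related sketch (assuming Apache with mod_headers enabled, placed in the staging directory's .htaccess) sends the same noindex directive as an HTTP response header, so no individual page needs the meta tag:

# Ask crawlers not to index or follow anything served from this directory
Header set X-Robots-Tag "noindex, nofollow"

Unlike robots.txt, this still lets crawlers fetch the pages; it only asks them not to list the pages in search results.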

answered Sep 28 '22 by Joachim Isaksson