How to hide a website from search engines

I'm developing a website, and I want to host the development version for the client so he can approve it, but I don't want the site to be found by search engines.

How can I hide it?

I've heard about adding a robots.txt file?

Any advice?

asked Aug 28 '12 by RedhopIT


4 Answers

You can find it on Wikipedia:

User-agent: *
Disallow: /

Put that in your robots.txt file, which goes in the top-level directory of your web server, as described here.
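
For instance, here is a minimal sketch assuming an Apache-style document root of /var/www/html; your server's top-level directory may differ:

# Saved as /var/www/html/robots.txt (path is an assumption),
# so it is served at http://example.com/robots.txt
User-agent: *    # the rule applies to every crawler
Disallow: /      # disallow crawling of the entire site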

answered Sep 19 '22 by woz


If you don't post public links to the website, search engines will not easily find it. However, at some point there is bound to be a link to it if enough people discuss the site.

Robots.txt will work for all legitimate search engines, such as Google. It will not help, however, if someone stumbles on the URL or it is left in a browser history.

The surest way is to put a password on the site: http://www.elated.com/articles/password-protecting-your-pages-with-htaccess/
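
For reference, here is a minimal sketch of HTTP Basic authentication in an .htaccess file on Apache. The realm name and file paths below are assumptions; the password file itself is created with the htpasswd utility.

# .htaccess in the site's document root (requires Apache's mod_auth_basic)
# The realm name and the AuthUserFile path are assumptions; adjust them.
AuthType Basic
AuthName "Development site"
# Keep the password file outside the web root
AuthUserFile /home/user/.htpasswd
Require valid-user
# Create the password file first with:
#   htpasswd -c /home/user/.htpasswd client

Note that this only takes effect if the server's AllowOverride setting permits authentication directives in .htaccess files.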

answered Sep 20 '22 by jpa


Yes, you can use robots.txt. Check out this guide: http://www.robotstxt.org/robotstxt.html.

To exclude all robots from the entire server:

User-agent: *
Disallow: /
answered Sep 22 '22 by mrswadge


I use robots.txt when I need to hide folders or sites in development: http://www.robotstxt.org/robotstxt.html. I think it will work best for what you need.
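
As a variation, here is a minimal sketch that hides only a development folder rather than the whole site; the /dev/ path is a hypothetical example:

User-agent: *      # applies to every crawler
Disallow: /dev/    # block only the development folder; the rest stays crawlable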

answered Sep 18 '22 by Andy McCormick