I'm developing a website, and I want to host the development work so the client can review and approve it, but I don't want the site to be found by search engines.
How can I hide it?
I've heard about adding a robots.txt file?
Any advice?
You can find it on Wikipedia:
User-agent: *
Disallow: /
Put that in your robots.txt file, which goes in the top-level directory of your web server, as described here.
If you don't publish any public links to the website, search engines will not easily find it. However, once people start discussing the site, a link is bound to appear somewhere.
robots.txt will keep out all legitimate search engines, such as Google. It will not help, however, if someone stumbles on the URL or finds it in a browser history.
The most reliable way is to put a password on the site: http://www.elated.com/articles/password-protecting-your-pages-with-htaccess/
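As a minimal sketch, assuming the site runs on Apache with .htaccess overrides enabled (the password file path and realm name below are placeholders), an .htaccess file in the site root could look like this:
# Require a login for the whole development site
AuthType Basic
AuthName "Development site - authorized users only"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
You would first create the password file with something like: htpasswd -c /etc/apache2/.htpasswd client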
Yes, you can use robots.txt. Check out this guide: http://www.robotstxt.org/robotstxt.html
To exclude all robots from the entire server:
User-agent: *
Disallow: /
I use robots.txt when I need to hide folders or sites in development: http://www.robotstxt.org/robotstxt.html I think it'll work best for what you need.
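For example, to hide only a work-in-progress folder rather than the whole server (assuming the development copy lives under /dev/, which is just an illustrative path), the robots.txt would be:
User-agent: *
Disallow: /dev/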