
Preventing direct access to robots.txt through .htaccess

I want to prevent users from accessing my robots.txt file, but I still want search engines to read it. Is that possible? If yes, how do I do it? I believe that if I write the following in .htaccess it will work, but I am afraid it will also block search engines from accessing it.

Order deny,allow
Deny from all

Thanks

Ali asked Oct 12 '25 12:10


1 Answer

Since a standard robots.txt is served from the root of your domain, I don't think what you are asking is possible unless you can somehow reliably distinguish search engines from ordinary users.

You could try filtering by user agent or possibly by IP range.
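As a rough sketch of the user-agent approach, the following .htaccess fragment allows robots.txt only to requests whose User-Agent matches a known crawler string. The bot names here are illustrative examples, and this uses Apache 2.2-style access directives (mod_setenvif plus Order/Deny/Allow); keep in mind that User-Agent headers are trivially spoofed, so this is not real protection:

```apache
<Files "robots.txt">
    # Flag requests that claim to be a well-known crawler.
    # Reliable verification would require reverse-DNS checks,
    # since anyone can send these User-Agent strings.
    SetEnvIfNoCase User-Agent "Googlebot|Bingbot|DuckDuckBot" is_crawler
    Order deny,allow
    Deny from all
    Allow from env=is_crawler
</Files>
```

On Apache 2.4 the equivalent would use `Require env is_crawler` instead of the Order/Deny/Allow block.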

Is there a reason why you don't want your users to see what is in your robots.txt file? After all, everything in that file is public.

martineno answered Oct 14 '25 10:10


