
Robots.txt block access to all https:// pages [closed]

Tags:

robots.txt

What would the syntax be to block all bots from accessing any https:// pages? I have an old site that no longer has an SSL certificate, and I want to block access to all https:// pages.

asked Apr 12 '26 08:04 by YodasMyDad

1 Answer

I don’t know whether this works in practice, i.e. whether robots actually request a separate robots.txt per protocol. But you could serve a different robots.txt for requests made over HTTPS.

So when http://example.com/robots.txt is requested, you deliver the normal robots.txt. And when https://example.com/robots.txt is requested, you deliver the robots.txt that disallows everything.
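One way to sketch this, assuming an Apache server with mod_rewrite enabled (the filename `robots-https.txt` is a hypothetical name chosen for this example):

```apache
# .htaccess sketch: serve a different robots.txt over HTTPS.
RewriteEngine On
# When the request arrives over HTTPS...
RewriteCond %{HTTPS} on
# ...internally rewrite robots.txt to the disallow-everything version.
RewriteRule ^robots\.txt$ /robots-https.txt [L]
```

The `robots-https.txt` file would then disallow everything:

```
User-agent: *
Disallow: /
```

The rewrite is internal (no redirect), so bots requesting `https://example.com/robots.txt` receive the restrictive file while `http://example.com/robots.txt` continues to serve the normal one.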

answered Apr 14 '26 23:04 by Gumbo
