I want to allow crawling of files in:
/directory/
but not crawling of files in:
/directory/subdirectory/
Is the correct robots.txt instruction:
User-agent: *
Disallow: /subdirectory/
I'm afraid that if I disallowed /directory/subdirectory/ I would be disallowing crawling of all files in /directory/, which I do not want to do. So am I correct in using:
User-agent: *
Disallow: /subdirectory/
You're overthinking it:
User-agent: *
Disallow: /directory/subdirectory/
is correct.
Spiders aren't stupid, they can parse a path :)
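If you want to sanity-check the rule before deploying it, Python's standard urllib.robotparser applies the same longest-prefix matching that crawlers use. This is a minimal sketch; the example.com URLs and /directory/ paths are placeholders for your own site:

from urllib.robotparser import RobotFileParser

# Parse the proposed robots.txt rules directly from a list of lines.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /directory/subdirectory/",
])

# Files directly under /directory/ remain crawlable.
print(rp.can_fetch("*", "https://example.com/directory/page.html"))               # True

# Files under /directory/subdirectory/ are blocked.
print(rp.can_fetch("*", "https://example.com/directory/subdirectory/page.html"))  # False

Because Disallow rules match on path prefixes, only URLs starting with /directory/subdirectory/ are affected; everything else under /directory/ stays open to crawlers.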