
robots.txt disallow wildcard

I am having trouble stopping Google from crawling a few URLs that cause errors.

I want to stop

  • /project/123984932842/download/pdf
  • /project/123984932842/download/zip

but allow

  • /project/123984932842
  • /project/123984932842/flat

I tried project/*/download/pdf but it doesn't seem to work. Does anyone know what would work?

henry.oswald asked Oct 21 '25 06:10


1 Answer

Do you have a / at the beginning of the Disallow: line?

User-agent: googlebot
Disallow: /project/*/download/pdf
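Google treats * in a rule as "any sequence of characters" and otherwise matches rules as path prefixes (an optional $ anchors the end). That matching can be sketched in Python; google_rule_matches is a hypothetical helper written for illustration, not part of any library:

```python
import re

def google_rule_matches(rule: str, path: str) -> bool:
    """Approximate Google-style robots.txt rule matching:
    '*' matches any character sequence, '$' anchors the end of the
    path, and rules otherwise match as prefixes."""
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    # Prefix match: the rule must match starting at the beginning of the path.
    return re.match(pattern, path) is not None

rule = "/project/*/download/pdf"
print(google_rule_matches(rule, "/project/123984932842/download/pdf"))  # True
print(google_rule_matches(rule, "/project/123984932842"))               # False
print(google_rule_matches(rule, "/project/123984932842/flat"))          # False
```

With the leading /, the rule blocks the pdf download URLs while leaving /project/123984932842 and /project/123984932842/flat crawlable. A second Disallow line (e.g. /project/*/download/zip, or /project/*/download/ to cover both) would presumably be needed for the zip URLs. Note that Python's own urllib.robotparser does not implement this wildcard extension, which is why the matching is sketched by hand here.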
John Kugelman answered Oct 24 '25 14:10


