I have the following defined in my app.yaml:
handlers:
- url: /favicon.ico
  static_files: img/favicon.ico
  upload: noop
- url: /apple-touch-icon.png
  static_files: img/apple-touch-icon.png
  upload: noop
- url: /images
  static_dir: img
- url: /robots.txt
  static_files: media/robots.txt
  upload: noop
- url: /humans.txt
  static_files: media/humans.txt
  upload: noop
There are other mappings after the declaration for /humans.txt, but I've removed them for brevity. The noop directory is an empty directory.
However, my browser gives me a 404 when I try to access these URLs. Why?
Since you're using static files, upload should match the static_files location:
- url: /robots.txt
  static_files: media/robots.txt
  upload: media/robots.txt
- url: /humans.txt
  static_files: media/humans.txt
  upload: media/humans.txt
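The upload setting is a regular expression that tells App Engine which files to deploy for that handler, so pointing it at the empty noop directory means media/robots.txt and media/humans.txt are never uploaded as static files, which is why the requests 404. The icon handlers have the same problem; here is a minimal sketch of the corresponding fix, using the img/ paths from your config:

- url: /favicon.ico
  static_files: img/favicon.ico
  upload: img/favicon.ico  # upload must match the file referenced by static_files
- url: /apple-touch-icon.png
  static_files: img/apple-touch-icon.png
  upload: img/apple-touch-icon.png

After redeploying with these changes, requests to /favicon.ico, /apple-touch-icon.png, /robots.txt and /humans.txt should be served from the uploaded static files.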