For SEO purposes, I'm trying to make some files accessible at URLs like http://example.ca/robots.txt, but I'm running into a strange issue. The files are accessible with Firefox, but Chromium and Google's bots can't fetch them!
My routes:
# Map static resources from the /public folder to the /assets URL path
GET /assets/*file controllers.Assets.at(path="/public", file)
# Robots and Humans files
GET /$file<(robots|humans).txt> controllers.Assets.at(path="/public", file)
GET /$file<MJ12_576CD562EFAFA1742768BA479A39BFF9.txt> controllers.Assets.at(path="/public", file)
I'm not sure if it will make a difference, but try replacing the regex-based routes with explicit ones:
GET /robots.txt controllers.Assets.at(path="/public", file="robots.txt")
GET /humans.txt controllers.Assets.at(path="/public", file="humans.txt")
GET /MJ12_576CD562EFAFA1742768BA479A39BFF9.txt controllers.Assets.at(path="/public", file="MJ12_576CD562EFAFA1742768BA479A39BFF9.txt")
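With explicit routes in place, you can check how the server responds to a bot-like client rather than relying on a browser. A minimal sketch using curl, assuming the app is running locally on the default Play port 9000 (adjust host and port to your setup):

```shell
# Fetch robots.txt the way a browser would
curl -i http://localhost:9000/robots.txt

# Fetch it again spoofing Googlebot's User-Agent, to rule out
# user-agent-dependent behavior (e.g. a proxy or filter in front of Play)
curl -i -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
  http://localhost:9000/robots.txt
```

If both commands return 200 with the file contents, the routes are fine and the problem is likely upstream (reverse proxy, CDN, or firewall rules) rather than in Play itself.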