I submitted my robots.txt file to Google ages ago and it is still reporting "Syntax not understood" for the first line.
After Googling, the most common cause seems to be Google adding a '?' at the start of the line, but that isn't happening in my case.
The URL of the robots.txt file is:
www.leisurepursuits.co.uk/robots.txt
The error is:
Line 1: User-agent: * Syntax not understood
Luckily, there is a simple fix for this error. Update your robots.txt file (example.com/robots.txt) so that it allows Googlebot (and other crawlers) to crawl your pages, and test the changes with the robots.txt Tester in Google Search Console.
If you don't want your own crawler to respect robots.txt, simply write it so that it doesn't. If you are using a library that respects robots.txt automatically, you will have to disable that behaviour, usually via an option you pass to the library when you call it.
The Allow directive in robots.txt is used to counteract a Disallow directive, and is supported by Google and Bing. By combining Allow and Disallow, you can tell search engines they may access a specific file or page within a directory that is otherwise disallowed.
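For example, assuming a hypothetical /private/ directory that should stay blocked except for one page, the two directives combine like this:

```
User-agent: *
Disallow: /private/
Allow: /private/public-page.html
```

Here everything under /private/ is disallowed, but the more specific Allow rule lets crawlers fetch public-page.html.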
This error appears because Google expects the robots.txt file to be plain text encoded in UTF-8, with records (lines) separated by CR, CR/LF, or LF.
If the file is saved in an encoding that is not valid UTF-8, or your editor prepended an invisible byte order mark (BOM) to the first line, the file can be parsed incorrectly and line 1 will be reported as "Syntax not understood".
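A quick way to spot the invisible BOM is to inspect the raw bytes of the file. A minimal sketch (the byte sequence EF BB BF is the UTF-8 BOM):

```python
def has_utf8_bom(data: bytes) -> bool:
    """Return True if the content starts with the UTF-8 byte order mark."""
    return data.startswith(b"\xef\xbb\xbf")

# Example: a file saved "as UTF-8 with BOM" by some editors.
# The BOM is invisible in most text editors, which is why the error
# on line 1 looks so baffling.
sample = b"\xef\xbb\xbfUser-agent: *\nDisallow:\n"
print(has_utf8_bom(sample))  # True

# To check your real file, read it in binary mode:
# with open("robots.txt", "rb") as f:
#     print(has_utf8_bom(f.read(3)))
```

If this reports True, re-save the file as UTF-8 without a BOM (most editors have an explicit option for this) and upload it again.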
First, check your robots.txt URL at http://www.asymptoticdesign.co.uk/cgi-bin/check-url.pl by selecting the second option, "view source", and confirm that it responds successfully.
Now upload your robots.txt file again; it should no longer show the error reported previously.
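To confirm the re-uploaded file is served cleanly, you can fetch it and check that the content starts directly with a directive rather than a BOM. A small sketch, assuming the URL from the question (the live fetch requires network access, so it is left commented out):

```python
from urllib.request import urlopen


def first_line_ok(data: bytes) -> bool:
    """True when the content starts with a User-agent record, not a BOM."""
    if data.startswith(b"\xef\xbb\xbf"):
        return False
    return data.lstrip().startswith(b"User-agent")


# Sanity check on sample content:
print(first_line_ok(b"User-agent: *\nDisallow:\n"))  # True

# Uncomment to check the live file:
# with urlopen("http://www.leisurepursuits.co.uk/robots.txt") as resp:
#     print(first_line_ok(resp.read()))
```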