I was testing my website with online tools and one of the tools gave me this warning:
Your server appears to allow access from User-agent Libwww-perl. Botnet scripts that automatically look for vulnerabilities in your software are sometimes identified as User-Agent libwww-perl. By blocking access from libwww-perl you can eliminate many simpler attacks. Read more on blocking Libwww-perl access and improving your website's security.
My website is an ASP.NET MVC 5 site, and I've simply added these lines to my robots.txt file:
User-agent: *
Disallow: /
User-Agent: bingbot
Allow: /
However, the tool still reports the warning. What is the problem? I'm blocking all bots and only allowing bingbot.
Unless you give the URL or name of the online scanning tool, I can only guess that it tried to crawl your pages while sending a User-Agent: libwww-perl header and checked whether it received the page content, not whether you block this agent in your robots.txt.
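You can reproduce that check yourself with any HTTP client that lets you set the User-Agent header, for example with curl (the hostname and the exact libwww-perl version string below are placeholders):

curl -A "libwww-perl/6.05" http://www.example.com/

If the page body comes back, the scanner reports the warning regardless of what robots.txt says, because robots.txt was never consulted.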
The background for this is that robots.txt contains rules for well-behaved search engines, not for malware. From http://www.robotstxt.org/robotstxt.html:

robots can ignore your /robots.txt. Especially malware robots that scan the web for security vulnerabilities, and email address harvesters used by spammers will pay no attention.
I assume that to "fix" this warning, you must deny all requests for any page, image, or file whose HTTP headers contain User-Agent: libwww-perl. See this question on configuring IIS to deny these requests without modifying your website.
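For reference, here is a minimal sketch of such a rule using the IIS URL Rewrite module, placed inside <system.webServer> in the site's web.config. This assumes the URL Rewrite module is installed; the rule name is my own choice:

<rewrite>
  <rules>
    <!-- Deny any request whose User-Agent header contains "libwww-perl" -->
    <rule name="BlockLibwwwPerl" stopProcessing="true">
      <match url=".*" />
      <conditions>
        <add input="{HTTP_USER_AGENT}" pattern="libwww-perl" ignoreCase="true" />
      </conditions>
      <!-- Answer with 403 Forbidden instead of serving the page -->
      <action type="CustomResponse" statusCode="403" statusReason="Forbidden" statusDescription="Requests from libwww-perl are not served" />
    </rule>
  </rules>
</rewrite>

With this in place, the server returns 403 for such requests instead of the page content, which is exactly the behavior the scanner checks for.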
Personally, I would not deny these requests, as it is not worth the hassle. It is easy to change the User-Agent within a scanning tool, and most already let you mimic widely used browsers, so the security gain would be very small. On the other hand, a legitimate tool that honestly identifies itself would be locked out simply because it does not fake its identity.