I am running an ASP.NET web application under IIS 7.5 and my Application log is full of errors like this:
Event code: 3012
Event message: An error occurred processing a web or script resource request. The resource identifier failed to decrypt.
...
Exception information:
Exception type: HttpException
Exception message: Unable to validate data.
at System.Web.Configuration.MachineKeySection.EncryptOrDecryptData(Boolean fEncrypt, Byte[] buf, Byte[] modifier, Int32 start, Int32 length, Boolean useValidationSymAlgo, Boolean useLegacyMode, IVType ivType, Boolean signData)
...
Request information:
Request URL: http://www.mysite.com/WebResource.axd?d=l0ngstr1ng0fl3tt3rs4ndd1g1ts
Request path: /WebResource.axd
...
How can I prevent them from appearing? As per this link, I have added the following code to my Global.asax file:
void Application_Error(object sender, EventArgs e)
{
    // Code that runs when an *unhandled* error occurs

    // Get a reference to the source of the exception chain
    Exception ex = Server.GetLastError();
    string message = ex.Message;
    string path = Request.Path;

    // Ignore the following:
    // - errors due to bots trying AXD URLs
    // - errors due to <doNastyThings /> tags in the URLs
    if (
        (ex is HttpException && (path.StartsWith("/WebResource.axd") || path.StartsWith("/ScriptResource.axd"))) ||
        (ex is HttpException && message.StartsWith("A potentially dangerous Request.Path value was detected from the client"))
    )
    {
        // Clear the error to prevent it from appearing in the main Application log
        Server.ClearError();

        // Manually redirect to the error page, since this will no longer happen
        // automatically once the error has been cleared
        Response.Redirect("/Error");
    }
}
The second group of errors (for potentially dangerous requests) is being caught and suppressed by this code; the WebResource.axd errors, however, have already been written to the Application log by the time this code runs. I presume that's because the AXD handler handles error logging differently from the standard ASPX handler (but I have no idea what to do about it).
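One possible explanation: the "Event code: 3012" entries come from ASP.NET health monitoring, which writes to the Application log before Application_Error gets a chance to call Server.ClearError(). A hedged sketch of a workaround, assuming the default health-monitoring rule name "All Errors Default" (check your machine-level web.config for the exact names in your environment), is to remove that rule in web.config. Note this suppresses all health-monitoring error events, not just the WebResource.axd ones, so it may be too blunt for some sites:

```xml
<!-- web.config sketch: assumes the default rule name "All Errors Default".
     Removing the rule stops ASP.NET health monitoring from writing error
     events (including the 3012 entries) to the Application log. -->
<configuration>
  <system.web>
    <healthMonitoring enabled="true">
      <rules>
        <remove name="All Errors Default" />
      </rules>
    </healthMonitoring>
  </system.web>
</configuration>
```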
All help gratefully received!
I only get this error on requests from the Bingbot crawler. You can check whether a request really comes from Bingbot here.
So I added the following to my robots.txt file. It doesn't work unless you specifically name the user agent as bingbot:
User-agent: bingbot
Disallow: /ScriptResource.axd
Disallow: /combinescriptshandler.axd
Disallow: /WebResource.axd