 

Using web.config to ban user-agents

Tags: asp.net, iis

Is it possible to ban certain user agents directly from web.config? Certain robots seem not to follow robots.txt, and to avoid pointless server load (and log-file spamming) I'd like to prevent certain classes of request (in particular based on user-agent, or perhaps IP address) from proceeding.

Bonus points if you know if it's similarly possible to prevent such requests from being logged to IIS's log-file entirely. (i.e. if-request-match, forward to /dev/null, if you get my meaning).

A solution for win2003 would be preferable, but this is a recurring problem - if there's a clean solution for IIS7 but not IIS6, I'd be happy to know it.

Edit: Sorry 'bout the incomplete question earlier, I had tab+entered accidentally.

asked Jul 23 '09 by Eamon Nerbonne


1 Answer

This can be done pretty easily using the URL Rewrite module in IIS7, though I don't know whether it also keeps those requests out of the IIS log. Note that the two conditions below are combined with logical AND (the default), so requests from RogueBotName are aborted only for the URLs listed in the MyPrivatePages rewrite map.

 <rewrite>
  <rules>
    <rule name="Ban user-agent RogueBot" stopProcessing="true">
      <match url=".*" />
      <!-- Both conditions must match (MatchAll is the default grouping). -->
      <conditions>
        <add input="{HTTP_USER_AGENT}" pattern="RogueBotName" />
        <!-- Matches only URLs listed in the MyPrivatePages map below. -->
        <add input="{MyPrivatePages:{REQUEST_URI}}" pattern="(.+)" />
      </conditions>
      <!-- AbortRequest drops the connection without sending a response. -->
      <action type="AbortRequest" />
    </rule>
  </rules>
  <rewriteMaps>
    <rewriteMap name="MyPrivatePages">
      <add key="/PrivatePage1.aspx" value="block" />
      <add key="/PrivatePage2.aspx" value="block" />
      <add key="/PrivatePage3.aspx" value="block" />
    </rewriteMap>
  </rewriteMaps>
</rewrite>
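If the goal is to ban a user-agent (or an IP range, as the question mentions) site-wide rather than only on specific pages, the rewrite-map condition can be dropped and an explicit 403 returned instead of aborting the connection. A minimal sketch, assuming URL Rewrite is installed; "RogueBotName" and the 192.0.2.* range are placeholders you'd replace with the actual offenders:

```xml
<rewrite>
  <rules>
    <!-- Sketch: block by user-agent OR source IP, returning an explicit 403. -->
    <rule name="Ban bad clients" stopProcessing="true">
      <match url=".*" />
      <!-- MatchAny: either condition alone triggers the rule. -->
      <conditions logicalGrouping="MatchAny">
        <add input="{HTTP_USER_AGENT}" pattern="RogueBotName" />
        <add input="{REMOTE_ADDR}" pattern="^192\.0\.2\." />
      </conditions>
      <action type="CustomResponse" statusCode="403"
              statusReason="Forbidden"
              statusDescription="Banned client" />
    </rule>
  </rules>
</rewrite>
```

As far as I know, requests rejected this way still show up in the IIS log; I'm not aware of a URL Rewrite option that suppresses logging for matched requests.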
answered Sep 21 '22 by Albert Walker