Is it possible to ban certain user agents directly from web.config? Certain robots seem not to follow robots.txt, and to avoid pointless server load (and log-file spamming) I'd like to prevent certain classes of request (in particular based on user agent, or perhaps IP address) from proceeding.
Bonus points if you know if it's similarly possible to prevent such requests from being logged to IIS's log-file entirely. (i.e. if-request-match, forward to /dev/null, if you get my meaning).
A solution for win2003 would be preferable, but this is a recurring problem - if there's a clean solution for IIS7 but not IIS6, I'd be happy to know it.
Edit: Sorry 'bout the incomplete question earlier, I had tab+entered accidentally.
This can be done pretty easily using the URL Rewrite module in IIS7. But I really don't know whether this will prevent those requests from being logged.
<rewrite>
  <rules>
    <rule name="Ban user-agent RogueBot" stopProcessing="true">
      <match url=".*" />
      <!-- Both conditions must match: the user agent contains "RogueBotName"
           AND the requested URL appears in the MyPrivatePages rewrite map. -->
      <conditions>
        <add input="{HTTP_USER_AGENT}" pattern="RogueBotName" />
        <add input="{MyPrivatePages:{REQUEST_URI}}" pattern="(.+)" />
      </conditions>
      <!-- AbortRequest drops the connection without returning a response. -->
      <action type="AbortRequest" />
    </rule>
  </rules>
  <rewriteMaps>
    <rewriteMap name="MyPrivatePages">
      <add key="/PrivatePage1.aspx" value="block" />
      <add key="/PrivatePage2.aspx" value="block" />
      <add key="/PrivatePage3.aspx" value="block" />
    </rewriteMap>
  </rewriteMaps>
</rewrite>
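Since the question also mentions blocking by IP address: the same URL Rewrite mechanism can key on the `{REMOTE_ADDR}` server variable instead of `{HTTP_USER_AGENT}`. A minimal sketch follows; the rule name and the `192.0.2.*` range are placeholders for illustration, not values from the original answer.

```xml
<rewrite>
  <rules>
    <!-- Sketch: abort any request arriving from a given address range.
         The pattern below matches 192.0.2.* (a placeholder range);
         substitute the actual offending addresses. -->
    <rule name="Ban IP range" stopProcessing="true">
      <match url=".*" />
      <conditions>
        <add input="{REMOTE_ADDR}" pattern="^192\.0\.2\." />
      </conditions>
      <action type="AbortRequest" />
    </rule>
  </rules>
</rewrite>
```

As with the user-agent rule, this only applies to IIS7's URL Rewrite module; on IIS6/win2003 there is no built-in equivalent, so a third-party ISAPI rewrite filter or IIS's own IP address restrictions would be needed instead.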