The website I work on hosts content that constantly gets scraped and posted elsewhere.
Is it possible to do URL rewrites so that normal users and white-listed crawlers can view the website, but unidentifiable browsers are blocked?
If someone really wants to scrape your content, I guess it's only a matter of time until they adapt their technique to fake an allowed browser. Still, serving different content per user agent is a nice feature to explore.
Yes, you can do that using the URL Rewrite module (I'm using v2, but it should work with v1.x as well, although I have no v1.x around to test):
<system.webServer>
  <rewrite>
    <rules>
      <rule name="UserAgentRedirect" stopProcessing="true">
        <!-- Match any requested URL -->
        <match url="^(.*)$" />
        <conditions>
          <!-- Check the User-Agent string for iphone or ipod -->
          <add input="{HTTP_USER_AGENT}" pattern="(iphone|ipod)" />
        </conditions>
        <action type="Rewrite" url="/special-page.aspx" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
With the above rule, ALL requests from an iPhone or iPod (or any other browser/app that has iphone or ipod in its User-Agent string) will be rewritten (an internal redirect) to /special-page.aspx.
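To address the original question of blocking unidentifiable browsers while letting normal users and white-listed crawlers through, the same condition can be inverted with negate="true". The sketch below is untested, and the whitelist pattern (Mozilla|Googlebot|bingbot) is only an illustrative placeholder; substitute the user agents you actually want to allow:

<system.webServer>
  <rewrite>
    <rules>
      <rule name="BlockUnknownUserAgents" stopProcessing="true">
        <match url="^(.*)$" />
        <conditions>
          <!-- negate="true" makes the condition succeed when the
               User-Agent does NOT match the whitelist pattern -->
          <add input="{HTTP_USER_AGENT}" pattern="(Mozilla|Googlebot|bingbot)" negate="true" />
        </conditions>
        <!-- AbortRequest drops the connection; use CustomResponse to return a 403 instead -->
        <action type="AbortRequest" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>

Keep in mind (as noted above) that a determined scraper can simply fake an allowed User-Agent string, so treat this as a speed bump rather than real protection.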