
Asp.net Request.Browser.Crawler - Dynamic Crawler List?

I read "Why Request.Browser.Crawler Is Always False in C#" (http://www.digcode.com/default.aspx?page=ed51cde3-d979-4daf-afae-fa6192562ea9&article=bc3a7a4f-f53e-4f88-8e9c-c9337f6c05a0).

Does anyone use a method to dynamically update the crawler list, so that Request.Browser.Crawler is actually useful?

asked Jan 10 '09 by Click Ok


2 Answers

I've been happy with the results supplied by Ocean's Browsercaps. It supports crawlers that Microsoft's config files have not bothered detecting. It will even parse out which version of the crawler is on your site, not that I really need that level of detail.

answered Nov 11 '22 by DavGarcia


You could check the Request.UserAgent string against a regular expression.

Peter Bromberg wrote a nice article about writing an ASP.NET Request Logger and Crawler Killer.

Here is the method he uses in his Logger class:

// Requires: using System.Web; using System.Text.RegularExpressions;
public static bool IsCrawler(HttpRequest request)
{
   // Change the next line to "bool isCrawler = false;" to use this
   // method to deny certain bots instead of detecting them
   bool isCrawler = request.Browser.Crawler;
   // Microsoft's browser definitions fail to detect several crawlers
   if (!isCrawler)
   {
       // Put any additional known crawlers in the Regex below.
       // To deny bots instead, set isCrawler = false above and list
       // only the bots you want to deny in the following Regex.
       Regex regEx = new Regex("Slurp|slurp|ask|Ask|Teoma|teoma");
       isCrawler = regEx.Match(request.UserAgent).Success;
   }
   return isCrawler;
}
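Since the question asks about keeping the list up to date dynamically, here is a minimal sketch (not from Bromberg's article) of loading the crawler patterns from a text file, so the list can be edited without recompiling. The file name `crawlers.txt`, its one-pattern-per-line format, and the fallback defaults are all assumptions for illustration.

```csharp
using System;
using System.IO;
using System.Text.RegularExpressions;

public static class CrawlerDetector
{
    // Built once at startup from the pattern file (hypothetical path).
    private static readonly Regex CrawlerRegex = BuildRegex("crawlers.txt");

    // Joins one pattern per line into a single alternation regex.
    // IgnoreCase removes the need for "Slurp|slurp"-style duplicates.
    private static Regex BuildRegex(string path)
    {
        string[] patterns = File.Exists(path)
            ? File.ReadAllLines(path)
            : new[] { "Slurp", "ask", "Teoma" }; // fallback defaults
        return new Regex(string.Join("|", patterns), RegexOptions.IgnoreCase);
    }

    // Check the raw user-agent string; callers in ASP.NET would pass
    // Request.UserAgent after first consulting Request.Browser.Crawler.
    public static bool IsCrawler(string userAgent)
    {
        return userAgent != null && CrawlerRegex.IsMatch(userAgent);
    }
}
```

Note that short substrings like "ask" can false-positive on unrelated user agents, so a real pattern file should use more specific tokens (e.g. full bot names).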
answered Nov 11 '22 by splattne