
Spider/Robot UserAgent Detection C#

I'm working on an application that redirects users to upgrade browser if they are not on our browser list.

My goal is to create an exception to detect if they are a crawler, based on their UserAgent string.

At this point, I'm getting the message "...no definition or extension method for .ToLower".

Here is my code:

    private bool IsValidCrawler(HttpRequestBase request)
    {
        bool isCrawler = true;

        switch (request.Browser.Crawler.ToLower())  
        {
            case "googlebot":
            case "bingbot":
            case "yahoo!":
            case "facebookexternalhit":
            case "facebookplatform":
                break;
        }

        return isCrawler;
    }

Can anyone point me to where I have gone wrong?

Mark asked Dec 16 '22

1 Answer

If you look at the documentation for the Crawler property (http://msdn.microsoft.com/en-us/library/system.web.configuration.httpcapabilitiesbase.crawler(v=vs.110).aspx), you'll notice it's a Boolean, not a string, so calling .ToLower() on it fails to compile.

The property itself already tells you whether the request is coming from a known crawler, so you don't need a switch at all. The following keeps your method signature so you don't have to change much elsewhere:

    private bool IsValidCrawler(HttpRequestBase request)
    {
        // Crawler is a bool populated from ASP.NET's browser definition files.
        bool isCrawler = request.Browser.Crawler;

        return isCrawler;
    }
Babak Naffas answered Dec 31 '22