
Ruby code to check if a website has search engine friendly URLs

I am developing an application in Rails which needs to check whether an entered website has search-engine-friendly URLs or not. One solution I have in mind is using Nokogiri to parse the site's HTML and look at the link tags to find URLs and see whether they are search engine friendly. Is there any other way this can be done? Any help would be really great.

asked Jul 03 '12 by Jimmy Thakkar

1 Answer

You have two problems here:

  1. How do you formally (programmatically) define what a "search engine friendly URL" is? I'm assuming you have some way of doing this already (a rough sketch follows this list). So that leaves...

  2. How to check all the links on a website.
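
If you don't yet have a check for (1), here is one possible heuristic, purely as an illustration. The seo_friendly? name and the specific patterns are assumptions on my part, not an established definition; adjust them to whatever you consider "friendly":

    require 'uri'

    # Hypothetical heuristic: flag a URL as unfriendly if it relies on
    # query-string ids, raw script extensions, or long opaque numeric segments.
    def seo_friendly?(url)
      uri = URI.parse(url)
      return false if uri.query && uri.query =~ /(^|&)(id|sid|sessionid|page_id)=/i  # ids passed in the query string
      return false if uri.path =~ /\.(php|asp|aspx|jsp|cgi)$/i                       # raw script extensions
      return false if uri.path =~ /\/\d{6,}(\/|$)/                                   # long numeric path segments
      true
    rescue URI::InvalidURIError
      false
    end

    seo_friendly?("http://example.com/products/blue-widget")        # => true
    seo_friendly?("http://example.com/index.php?id=42&sessionid=9") # => false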

So for (2) I would look at something like Anemone, which will make it easy for you to crawl complete websites:

Anemone is a Ruby library that makes it quick and painless to write programs that spider a website. It provides a simple DSL for performing actions on every page of a site, skipping certain URLs, and calculating the shortest path to a given page on a site.

The multi-threaded design makes Anemone fast. The API makes it simple. And the expressiveness of Ruby makes it powerful.

For simple crawling Anemone will even give you an array of all links on a page, so you won't necessarily even need Nokogiri. For more complex stuff maybe you want to combine Anemone with something like Mechanize and Nokogiri. That depends on your requirements.
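
Putting the two together, a minimal sketch might look like the following. It assumes the hypothetical seo_friendly? helper above and a placeholder start URL; the depth limit is optional:

    require 'anemone'

    # Crawl the site and collect every discovered link that fails the check.
    unfriendly = []

    Anemone.crawl("http://www.example.com/", :depth_limit => 2) do |anemone|
      anemone.on_every_page do |page|
        page.links.each do |link|          # page.links is an array of URI objects
          unfriendly << link.to_s unless seo_friendly?(link.to_s)
        end
      end
    end

    puts unfriendly.uniq

Because Anemone hands you each page's links directly, there is no HTML parsing of your own in this simple case; bring in Nokogiri or Mechanize only if you need to inspect the page markup itself.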

answered Sep 21 '22 by Casper