How do I get a complete list of all the URLs that my Rails application could generate?
I don't want the routes that I get from rake routes; instead, I want the actual URLs corresponding to all the dynamically generated pages in my application...
Is this even possible?
(Background: I'm doing this because I want a complete list of URLs for some load testing, which has to cover the entire breadth of the application.)
TIP: If you ever want to list all the routes of your application, you can run rails routes in your terminal. If you want to list the routes of a specific resource, you can use rails routes | grep hotel, which will list all the routes for Hotel.
What is the best way to get the current request URL in Rails? Use request.original_url to get the current URL. You can also write request.url instead of request.request_uri; it combines the protocol (usually http://) with the host and the request_uri to give you the full address.
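For instance, a minimal sketch of using it inside a controller (the controller name and log message here are hypothetical, for illustration only):

# app/controllers/pages_controller.rb -- hypothetical example
class PagesController < ApplicationController
  def show
    # request.original_url combines protocol, host, path, and query string,
    # e.g. "http://localhost:3000/pages/1?draft=true"
    Rails.logger.info "Serving #{request.original_url}"
  end
end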
I was able to produce useful output with the following command:
$ wget --spider -r -nv -nd -np http://localhost:3209/ 2>&1 | ack -o '(?<=URL:)\S+'
http://localhost:3209/
http://localhost:3209/robots.txt
http://localhost:3209/agenda/2008/08
http://localhost:3209/agenda/2008/10
http://localhost:3209/agenda/2008/09/01
http://localhost:3209/agenda/2008/09/02
http://localhost:3209/agenda/2008/09/03
^C
wget arguments:
  --spider                don't download anything.
  -r, --recursive         specify recursive download.
  -nv, --no-verbose       turn off verboseness, without being quiet.
  -nd, --no-directories   don't create directories.
  -np, --no-parent        don't ascend to the parent directory.

ack is like grep but uses Perl regexps, which are more complete/powerful. The -o flag tells ack to only output the matched substring, and the pattern I used looks for anything non-space preceded by 'URL:'.
You could pretty quickly hack together a program that grabs the output of rake routes and parses it to put together a list of URLs; a sketch of that approach follows.
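Here is a minimal sketch of that idea, assuming a Rails version where rails routes prints a URI-pattern column; the host, port, and the sample values substituted for dynamic segments are placeholders you would replace with data from your own application:

#!/usr/bin/env ruby
# parse_routes.rb -- hypothetical sketch: turn `rails routes` output into URLs.
# Assumptions: run from the app root; HOST and SAMPLE_VALUES are made up.

HOST = "http://localhost:3209"
SAMPLE_VALUES = { "id" => "1", "year" => "2008", "month" => "09" } # assumed

urls = []
`rails routes`.each_line do |line|
  # The URI pattern is the column that starts with "/".
  pattern = line.split.find { |col| col.start_with?("/") }
  next unless pattern

  # Drop the "(.:format)" suffix Rails appends, then substitute each
  # dynamic segment (e.g. ":id") with a sample value.
  path = pattern.gsub("(.:format)", "").gsub(/:(\w+)/) do
    SAMPLE_VALUES.fetch(Regexp.last_match(1), "1")
  end
  urls << HOST + path
end

puts urls.uniq

Running it prints one URL per line, ready to feed into a load-testing tool; note that segments without an entry in SAMPLE_VALUES fall back to "1", so real coverage still needs IDs drawn from your actual data.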
What I have typically done for load testing is to use a tool like WebLOAD and script several different types of user sessions (or different routes users can take). Then I create a mix of user sessions and run them through the website to get something close to an accurate picture of how the site might run.
Typically I will also do this on a total of 4 different machines running about 80 concurrent user sessions to realistically simulate what will be happening through the application. This also makes sure I don't spend overly much time optimizing infrequently visited pages and can, instead, concentrate on overall application performance along the critical paths.
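If you don't have a tool like WebLOAD handy, here is a minimal plain-Ruby stand-in for the same idea (the session definitions, weights, thread count, host, and paths are all assumptions for illustration) that runs a weighted mix of scripted sessions concurrently:

#!/usr/bin/env ruby
# session_mix.rb -- hypothetical sketch of a weighted user-session mix.
require "net/http"

HOST = URI("http://localhost:3209") # assumed host/port

# Each session is an ordered list of paths a user type visits; the weights
# control how often each session type is picked. All values are made up.
SESSIONS = {
  browser: { weight: 3, paths: ["/", "/agenda/2008/09/01"] },
  robot:   { weight: 1, paths: ["/robots.txt"] },
}

def run_session(paths)
  Net::HTTP.start(HOST.host, HOST.port) do |http|
    paths.each { |path| http.get(path) }
  end
end

# Build a weighted pool of session scripts and run a few concurrent "users".
pool = SESSIONS.values.flat_map { |s| [s[:paths]] * s[:weight] }
threads = Array.new(8) do
  Thread.new { 10.times { run_session(pool.sample) } }
end
threads.each(&:join)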