
Integrating Automated Web Testing Into Build Process

I'm looking for suggestions to improve the process of automating functional testing of a website. Here's what I've tried in the past.

I used to have a test project using WATIN. You effectively write what look like "unit tests" and use WATIN to automate a browser to click around your site etc.

Of course, you need a site to be running. So I made the tests copy the code from my web project to a local directory and start a web server pointing at that directory before any of the tests ran.

That way, someone new could simply get latest from our source control and run our build script, and see all the tests run. They could also simply run all the tests from the IDE.

The problem I ran into was that I spent more time maintaining the code that set up the test environment than the tests themselves. Not to mention that the suite took a long time to run because of all that copying. Also, I needed to test various scenarios, including installation, meaning I needed to be able to set the database to various initial states.

I'm curious what you've done to automate functional testing to solve some of these issues while still keeping it simple.

MORE DETAILS Since people asked for more details, here they are. I'm running ASP.NET using Visual Studio and Cassini (the built-in web server). My unit tests run in MbUnit (but that's not so important; it could be NUnit or xUnit.net). Typically, I have a separate unit test project run all my WATIN tests. In the AssemblyLoad phase, I start the web server and copy all my web application code locally.
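Roughly, that startup code looks like the following - a simplified sketch, assuming an NUnit-style [SetUpFixture] and the Visual Studio 2008 Cassini executable (WebDev.WebServer.exe); the paths and port here are illustrative:

    using System.Diagnostics;
    using System.IO;
    using NUnit.Framework;

    [SetUpFixture]
    public class TestEnvironment {
        private const string SourcePath = @"..\..\..\MyWebApp";   // hypothetical web project location
        private const string StagingPath = @"C:\temp\test-site";  // hypothetical staging directory
        private Process _server;

        [SetUp]
        public void CopySiteAndStartWebServer() {
            // Copy the web application into the staging directory.
            foreach (string file in Directory.GetFiles(SourcePath, "*.*", SearchOption.AllDirectories)) {
                string dest = Path.Combine(StagingPath, file.Substring(SourcePath.Length + 1));
                Directory.CreateDirectory(Path.GetDirectoryName(dest));
                File.Copy(file, dest, true);
            }

            // Point Cassini at the copy.
            _server = Process.Start(
                @"C:\Program Files\Common Files\Microsoft Shared\DevServer\9.0\WebDev.WebServer.exe",
                "/port:8085 /path:\"" + StagingPath + "\"");
        }

        [TearDown]
        public void StopWebServer() {
            if (_server != null && !_server.HasExited) {
                _server.Kill();
            }
        }
    }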

I'm interested in solutions for any platform, but I may need more descriptions on what each thing means. :)

asked Aug 06 '09 by Haacked


2 Answers

Phil,

Automation can just be hard to maintain, but the more you use your automation for deployment, the more you can leverage it for test setup (and vice versa).

Frankly, it's easier to evolve automation code - factoring and refactoring it into specific, small units of functionality - when you're using a build tool that isn't just driving statically-compiled, pre-factored units of functionality, as is the case with NAnt and MSBuild. This is one of the reasons that many people who were relatively early users of tools like NAnt have moved off to Rake. The freedom to treat build code like any other code - to continually evolve its content and shape - is greater with Rake. You don't end up with the same stasis in automation artifacts as easily or as quickly with Rake, and it's a lot easier to script in Rake than in NAnt or MSBuild.

So, some part of your struggle is inherently bound up in the tools. To keep your automation sensible and maintained, you should be wary of obstructions that static build tools like NAnt and MSBuild impose.

I would suggest that you not couple your test environment bootstrapping to assembly load. That's an inside-out coupling that serves only brief convenience. There's nothing wrong (and likely everything right) with going to the command line and executing the build task that sets up the environment before running tests, whether from the IDE, from the command line, or from an interactive console like the C# REPL from the Mono Project or IRB.

Test data setup is just a pain in the butt sometimes, but it has to be done.

You're going to need a library that you can call to create and clean up database state. You can make those calls right from your test code, but I personally tend to avoid doing this because there is more than one good use of test data or sample data control code.
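A minimal sketch of what such a library might look like, assuming SQL Server; the connection string and table names are illustrative, not prescriptive:

    using System.Data.SqlClient;

    public static class DatabaseState {
        private const string ConnectionString =
            "Server=.;Database=MyApp_Test;Integrated Security=true"; // hypothetical test database

        // Restore the database to a known baseline state.
        public static void Reset() {
            Execute("DELETE FROM Orders; DELETE FROM Customers;"); // illustrative tables
        }

        private static void Execute(string sql) {
            using (var connection = new SqlConnection(ConnectionString))
            using (var command = new SqlCommand(sql, connection)) {
                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }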

I drive all sample data control from HTTP. I write controllers with actions specifically for controlling sample data and issue GETs against those actions through Selenium. I use these to create and clean up data. I can compose GETs to these actions to create common scenarios of setup data, and I can pass specific values for data as request parameters (or form parameters if need be).

I keep these controllers in an area that I usually call "test_support".
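In ASP.NET MVC terms, one of those controllers might look something like this - a sketch with hypothetical action and parameter names:

    using System.Web.Mvc;

    // Lives in the "test_support" area and is never deployed to production.
    public class SampleDataController : Controller {
        // GET /test_support/sample_data/create_customer?name=Acme&plan=trial
        public ActionResult CreateCustomer(string name, string plan) {
            // Persist the customer the same way the app normally would (omitted here).
            return Content("created " + name);
        }

        // GET /test_support/sample_data/clean_up
        public ActionResult CleanUp() {
            // Tear down everything the tests created (omitted here).
            return Content("clean");
        }
    }

From Selenium, setting up a scenario is then just something like selenium.Open("/test_support/sample_data/create_customer?name=Acme&plan=trial").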

My automation for deploying the website does not deploy the test_support area or its routes and mapping. As part of my deployment verification automation, I make sure that the test_support code is not in the production app.
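That verification can be as simple as asserting that a known test_support URL returns a 404 from the production deployment - a sketch, with an illustrative URL:

    using System.Net;
    using NUnit.Framework;

    [TestFixture]
    public class DeploymentVerification {
        [Test]
        public void Test_support_is_not_deployed() {
            var request = (HttpWebRequest)WebRequest.Create(
                "http://www.example.com/test_support/sample_data/clean_up"); // illustrative production URL
            try {
                using (request.GetResponse()) {
                    Assert.Fail("test_support should not exist in production");
                }
            }
            catch (WebException ex) {
                // System.Net surfaces a 404 as a WebException carrying the response.
                var response = (HttpWebResponse)ex.Response;
                Assert.AreEqual(HttpStatusCode.NotFound, response.StatusCode);
            }
        }
    }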

I also use the test_support code to automate control over the entire environment - replacing services with fakes, turning off subsystems to simulate failures and failovers, activating or deactivating authentication and access control for functional testing that isn't concerned with these facets, etc.

There's a great secondary value to controlling your web app's sample data or test data from the web: when demoing the app, or when doing exploratory testing, you can create the data scenarios you need just by issuing some GETs against known (or guessable) URLs in the test_support area. Making a disciplined effort to stick to RESTful routes and resource-orientation here will really pay off.

There's a lot more to this functional automation (including testing, deployment, demoing, etc.), so the better designed these resources are, the better time you'll have maintaining them over the long haul, and the more opportunities you'll find to leverage them in unforeseen but beneficial ways.

For example, writing domain model code over the semantic model of your web pages will help create much more understandable test code and decrease brittleness. If you do this well, you can use those same models with a variety of different drivers, so that you can leverage them in stress tests and load tests as well as functional tests, and use them from the command line as exploratory tools. By the way, this kind of thing is easier to do when you're not bound to driver types as you are when you use a static language. There's a reason why many leading testing thinkers and doers work in Ruby, and why Watir is written in Ruby. Reuse, composition, and expressiveness are much easier to achieve in Ruby than in C# test code. But that's another story.
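For instance, a page model over a login page might look like this in C# - a sketch using WatiN's API, with illustrative URL and element names:

    using WatiN.Core;

    public class LoginPage {
        private readonly IE _browser;

        public LoginPage(IE browser) {
            _browser = browser;
            _browser.GoTo("http://localhost:8085/Login.aspx"); // illustrative URL
        }

        // Tests say "log in as X" instead of poking at individual fields,
        // which keeps them readable and less brittle.
        public void LogInAs(string userName, string password) {
            _browser.TextField(Find.ByName("UserName")).TypeText(userName);
            _browser.TextField(Find.ByName("Password")).TypeText(password);
            _browser.Button(Find.ByValue("Log in")).Click();
        }
    }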

Let's catch up sometime and talk more about the other 90% of this stuff :)

answered by Scott Bellware


We used Plasma on one project. It emulates a web server in process - just point it at the root of your web application project.

It was surprisingly stable - no copying files around, no starting up an out-of-process server.

Here is how a test using Plasma looks for us...

    [Test]
    public void Can_log_in() {
        // Request the login page and pull the login form out of the response.
        AspNetResponse response = WebApp.ProcessRequest("/Login.aspx");
        AspNetForm form = response.GetForm();

        form["UserName"] = User.UserName;
        form["Password"] = User.Password;

        // Submit the form by "clicking" the LoginUser button.
        AspNetResponse loggedIn = WebApp.ProcessRequest(Button.Click(form, "LoginUser"));
        Assert.IsTrue(loggedIn.IsRedirect());

        // Follow the redirect and confirm the home page renders.
        AspNetResponse homePage = WebApp.ProcessRequest(loggedIn.GetRedirectUrl());
        Assert.AreEqual(200, homePage.Status);
    }

All the "AspNetResponse" and "AspNetForm" classes are included with Plasma.

answered by Mike