 

Generic Performance Testing Framework For .NET

I have a client/server application written in C#/.NET 3.5 that I want to do a bit of performance testing on. I've been looking for a generic framework to help me but haven't had much luck. I would like something that can manage a set of clients and perform random actions for them based on some settings. I would also like to record some data relating to this to help me work out some rough thresholds for my system, e.g. I can support n users performing x actions per second.

I would write code specific to my application to perform tasks such as:

  • Login/logout a client.
  • Send messages to the server to perform various actions.
  • Record acknowledgements and other messages from the server.
  • Measure statistics specific to the system.

I'm hoping the framework will then be able to take a set of parameters to describe a testing scenario such as:

  • Number of clients logged in at a given time.
  • Number of actions per second performed by each client.

It would then run the scenario, manage and track all of the users and actions and collate all of the data. (This is the boring bit I'm trying to avoid coding myself...) Ideally it would have some general measurements built in, e.g. time between sending a message and receiving a response, but I could code them myself if not.

I don't want to do any profiling of my code; I can always attach a profiler whilst running these tests later on. Instead I want to make some rough conclusions about my system, i.e. how many users can I throw at it before it breaks. (If there is a better term for this than 'performance testing' please let me know... Stress testing maybe?)

I realise I'm not giving very many specifics about the system here. It strikes me as a fairly general situation - I'm sure there are lots of client/server systems out there that people need to do similar tests on. I've found lots of web-based frameworks that do similar things, but they seem to be pretty ingrained in the web and don't lend themselves easily to non-HTTP systems.

Anyone know of anything that might help? My searching hasn't found anything yet. I should point out that I'm stuck with Visual Studio 2008 Professional for the foreseeable future, so if Visual Studio 2010 can do this, it's out of bounds for me. I guess it doesn't have to be a .NET framework provided I can still plug in my .NET code fairly easily.

EDIT: To be clear, my application isn't a website; it's a Windows Forms client application that connects via a custom protocol to a .NET service. I can write code to perform the relevant client actions, I just need a framework to put it in.

asked Sep 28 '11 by MrKWatkins

2 Answers

The keyword you are looking for is "Distributed testing".

SmartBear has a product called TestComplete which supports distributed testing. I don't think it can run multiple instances of your client on a single machine though (maybe it can, but I guess it's not a good idea anyway, since it would skew the performance results).

They also have an open source project called LoadUI. It is built to integrate with SoapUI, but you might be able to hook it up to your own client-test tool; I have no idea how much effort that would take.

These are the tools I know of, but there are many more distributed testing tools out there. While most are indeed aimed at web-based testing, they are often extensible enough to simply kick off a different (GUI-based) testing framework (my favorite is QAliber) that runs the tests on your client app.

answered Sep 22 '22 by Louis Somers


To my knowledge, such a framework does not currently exist (and it likely never will, because everyone's scenarios are so different). Because "Performance" means different things to different people and different projects, you will usually have to roll your own. Thankfully, with the advent of .NET 4 and the TPL, this has become easier than ever.

There are three aspects a Performance test needs to cover:

  1. Define "Load"
  2. Measure "Performance"
  3. Evaluate the Results

As you can imagine, both of these are deliberately loose terms. Load could be measured in requests per second, logged-in users, etc. Performance could be measured in response time, memory usage... you get the picture.

So your first step is to define a "unit load" that you can easily scale up. For example, you could create a ClientUnitLoad class that simulates a client and periodically performs actions against your service. The TPL makes it easy to scale up your ClientUnitLoad (though you are of course bound by hardware constraints); see the sketch below.
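
To make that concrete, here is a minimal sketch of the shape such a unit load could take. ClientUnitLoad, its action methods and the LoadRunner helper are all hypothetical stand-ins; you would fill them in with your own protocol code:

    using System;
    using System.Threading;
    using System.Threading.Tasks;

    // Hypothetical unit of load: one simulated client that logs in,
    // performs periodic actions against the service, then logs out.
    public class ClientUnitLoad
    {
        private readonly int _actionsPerSecond;

        public ClientUnitLoad(int actionsPerSecond)
        {
            _actionsPerSecond = actionsPerSecond;
        }

        public void Run(CancellationToken token)
        {
            Login();
            while (!token.IsCancellationRequested)
            {
                PerformRandomAction();
                Thread.Sleep(1000 / _actionsPerSecond);
            }
            Logout();
        }

        // Your application-specific code goes in these methods.
        private void Login() { /* connect via your custom protocol */ }
        private void Logout() { /* disconnect */ }
        private void PerformRandomAction() { /* send a message, record the ack */ }
    }

    // Scaling the unit load up with the TPL: one task per simulated client.
    public static class LoadRunner
    {
        public static void Run(int clientCount, int actionsPerSecond, TimeSpan duration)
        {
            var cts = new CancellationTokenSource();
            var tasks = new Task[clientCount];
            for (int i = 0; i < clientCount; i++)
            {
                var client = new ClientUnitLoad(actionsPerSecond);
                tasks[i] = Task.Factory.StartNew(
                    () => client.Run(cts.Token),
                    TaskCreationOptions.LongRunning); // dedicated thread per client
            }
            Thread.Sleep(duration); // let the scenario run
            cts.Cancel();
            Task.WaitAll(tasks);
        }
    }

LoadRunner.Run(100, 5, TimeSpan.FromMinutes(10)) would then simulate 100 clients performing 5 actions per second each for ten minutes.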

The next step is to measure Performance. How you measure depends heavily on the metric you want to collect: simple stopwatching is fine in 90% of cases, but it can only measure time. Building a custom tracing and metrics-collection infrastructure is definitely worth it, as you will also need it in production to verify your system performs as expected in the real world.
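
For illustration, a Stopwatch-based collector can be very small. This is a sketch; the MetricsCollector class and its metric-name keys are invented for the example:

    using System;
    using System.Collections.Generic;
    using System.Diagnostics;

    // Hypothetical collector: times operations with Stopwatch and
    // stores the raw samples for statistical analysis later.
    public class MetricsCollector
    {
        private readonly object _lock = new object();
        private readonly Dictionary<string, List<double>> _samples =
            new Dictionary<string, List<double>>();

        // Measure the wall-clock time of a single operation in milliseconds.
        public void Measure(string metric, Action operation)
        {
            var stopwatch = Stopwatch.StartNew();
            operation();
            stopwatch.Stop();
            Record(metric, stopwatch.Elapsed.TotalMilliseconds);
        }

        public void Record(string metric, double valueMs)
        {
            lock (_lock)
            {
                List<double> list;
                if (!_samples.TryGetValue(metric, out list))
                {
                    list = new List<double>();
                    _samples[metric] = list;
                }
                list.Add(valueMs);
            }
        }

        // Snapshot of all samples recorded for a metric so far.
        public List<double> GetSamples(string metric)
        {
            lock (_lock)
            {
                return new List<double>(_samples[metric]);
            }
        }
    }

Each simulated client would then wrap its calls, e.g. collector.Measure("SendMessage", () => SendMessageAndAwaitAck()), where SendMessageAndAwaitAck is your own application code.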

The last step is to evaluate your metrics. The biggest issues with performance tests are that a) they are slow and b) they are unreliable. There's no way around that. In general, you need large sample sizes (repeated test runs) and some sort of statistical analysis on your metrics to ignore outliers. The processed results should then be compared to a configurable performance baseline.
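
As a sketch of that kind of post-processing (the trimmed mean and the nearest-rank percentile shown here are just one reasonable choice of statistics, not the only one):

    using System;
    using System.Collections.Generic;
    using System.Linq;

    public static class MetricAnalysis
    {
        // Drop the top and bottom fraction of samples to ignore outliers,
        // then average what is left (a "trimmed mean").
        public static double TrimmedMean(IEnumerable<double> samples, double trimFraction)
        {
            var sorted = samples.OrderBy(s => s).ToList();
            int trim = (int)(sorted.Count * trimFraction);
            return sorted.Skip(trim).Take(sorted.Count - 2 * trim).Average();
        }

        // Nearest-rank percentile, e.g. Percentile(samples, 95) for the 95th.
        public static double Percentile(IEnumerable<double> samples, double percentile)
        {
            var sorted = samples.OrderBy(s => s).ToList();
            int rank = (int)Math.Ceiling(percentile / 100.0 * sorted.Count) - 1;
            return sorted[Math.Max(rank, 0)];
        }
    }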

Depending on the desired level of sophistication, that performance baseline may have simple acceptance criteria (such as: response time under 100 ms) or advanced, specific criteria (such as: response time does not increase by more than 10% when the number of logged-in users is 10 vs. 1).
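
In code, the advanced criterion could boil down to something like this, building on the hypothetical MetricAnalysis sketch above (RunScenario is an invented helper, and the 10% threshold comes from the example criterion):

    // Hypothetical baseline check: response time under load may not
    // degrade by more than 10% compared to a single logged-in user.
    List<double> samplesWith1User = RunScenario(clients: 1);   // invented helper
    List<double> samplesWith10Users = RunScenario(clients: 10);

    double baseline = MetricAnalysis.TrimmedMean(samplesWith1User, 0.05);
    double underLoad = MetricAnalysis.TrimmedMean(samplesWith10Users, 0.05);
    bool passed = underLoad <= baseline * 1.10;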

answered Sep 25 '22 by Johannes Rudolph