 

What best practices do you use for testing database queries?

I'm currently testing our solution, which has the whole "gamut" of layers: UI, middle tier, and the omnipresent database.

Before I joined my current team, query testing was done by testers manually crafting queries that should, in theory, return the same result set as the stored procedure under test, accounting for various relevancy rules, sorting, what have you.

This had the side effect of bugs being filed against the tester's query more often than against the actual query in question.

I proposed working with a known data set instead, so that you can infer exactly what should be returned because you control the data present -- previously, data was pulled from production, sanitized, and then loaded into our test databases.
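To make that concrete, here is a minimal sketch of the idea in Java with JDBC and JUnit; the customers table, the GetActiveCustomers stored procedure, and the connection string are hypothetical stand-ins, not our actual schema:

    import java.sql.*;
    import org.junit.Test;
    import static org.junit.Assert.*;

    public class KnownDataSetTest {

        // Hypothetical connection string; substitute your own test database.
        private static final String URL =
                "jdbc:mysql://localhost/testdb?user=test&password=test";

        @Test
        public void storedProcReturnsOnlyActiveCustomers() throws Exception {
            try (Connection conn = DriverManager.getConnection(URL)) {
                // Seed a known, hand-picked data set: two active rows, one inactive.
                try (Statement stmt = conn.createStatement()) {
                    stmt.executeUpdate("DELETE FROM customers");
                    stmt.executeUpdate("INSERT INTO customers (id, name, active) VALUES (1, 'Alice', 1)");
                    stmt.executeUpdate("INSERT INTO customers (id, name, active) VALUES (2, 'Bob', 0)");
                    stmt.executeUpdate("INSERT INTO customers (id, name, active) VALUES (3, 'Carol', 1)");
                }

                // Because we control the data, the expected result is known by
                // construction: exactly the two active customers, sorted by name
                // (assuming that's what the proc's spec says).
                try (CallableStatement proc = conn.prepareCall("{call GetActiveCustomers()}");
                     ResultSet rs = proc.executeQuery()) {
                    assertTrue(rs.next());
                    assertEquals("Alice", rs.getString("name"));
                    assertTrue(rs.next());
                    assertEquals("Carol", rs.getString("name"));
                    assertFalse(rs.next());
                }
            }
        }
    }

The point is that no second, hand-crafted query is needed; the expected result falls out of the fixture itself.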

People were still insistent on writing their own queries to test what the developers had created, and I suspect many still are. To my mind this isn't ideal at all; it just increases our testing footprint needlessly.

So I'm curious: which practices do you use to test scenarios like this, and what would be considered ideal for the best end-to-end coverage you can get without introducing chaotic data?

The issue I have is deciding where the best place is to do which testing. Do I just poke the service directly and compare that data set with what I can pull from the stored procedure? I have a rough idea and have been successful enough so far, but I feel like we're still missing something important here, so I'm looking to the community for insights that might help me formulate a better testing approach.

asked Nov 03 '08 by Steven Raybell


3 Answers

Testing stored procs requires that each person who tests has a separate instance of the database; this is non-negotiable. If you share environments, you won't be able to rely on the results of your tests. They'll be worthless.

You will also need to ensure that you roll back the database to its previous state after every test, so the results stay predictable and stable. Because of this rollback after every test, these tests will take much longer to complete than standard unit tests, so they'll probably be something you want to run overnight.
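Even without a dedicated tool, the per-test rollback can be done with a plain transaction. A minimal JUnit/JDBC sketch, with a hypothetical connection string and table:

    import java.sql.*;
    import org.junit.*;

    public class RollbackPerTestExample {

        private Connection conn;

        @Before
        public void openTransaction() throws SQLException {
            // Hypothetical connection string for the tester's private instance.
            conn = DriverManager.getConnection(
                    "jdbc:mysql://localhost/testdb?user=test&password=test");
            conn.setAutoCommit(false); // everything the test does stays in one transaction
        }

        @After
        public void rollBack() throws SQLException {
            conn.rollback(); // restore the database to its pre-test state
            conn.close();
        }

        @Test
        public void insertIsVisibleInsideTheTransaction() throws SQLException {
            try (Statement stmt = conn.createStatement()) {
                stmt.executeUpdate(
                        "INSERT INTO customers (id, name, active) VALUES (99, 'Temp', 1)");
                try (ResultSet rs = stmt.executeQuery(
                        "SELECT COUNT(*) FROM customers WHERE id = 99")) {
                    rs.next();
                    Assert.assertEquals(1, rs.getInt(1));
                }
            }
            // The @After rollback undoes the insert, so the next test sees a clean db.
        }
    }

One caveat: this only works if the code under test doesn't commit or open its own transactions internally.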

There are a few tools out there to help you with this. DbUnit is one of them, and I also believe Microsoft had a tool, Visual Studio for Database Professionals, that contained some support for DB testing.
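For illustration, here is roughly what a DbUnit-based setup looks like; the fixture file, table name, and connection details are assumptions:

    import org.dbunit.IDatabaseTester;
    import org.dbunit.JdbcDatabaseTester;
    import org.dbunit.dataset.IDataSet;
    import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
    import org.dbunit.operation.DatabaseOperation;
    import org.junit.*;

    public class DbUnitExample {

        private IDatabaseTester tester;

        @Before
        public void setUp() throws Exception {
            // Hypothetical driver/URL; point this at your private test instance.
            tester = new JdbcDatabaseTester("com.mysql.jdbc.Driver",
                    "jdbc:mysql://localhost/testdb", "test", "test");

            // customers.xml is an assumed flat-XML fixture describing known rows.
            IDataSet dataSet = new FlatXmlDataSetBuilder()
                    .build(getClass().getResourceAsStream("/customers.xml"));
            tester.setDataSet(dataSet);

            // CLEAN_INSERT wipes the tables named in the dataset and reloads the fixture.
            tester.setSetUpOperation(DatabaseOperation.CLEAN_INSERT);
            tester.onSetup();
        }

        @After
        public void tearDown() throws Exception {
            tester.onTearDown();
        }

        @Test
        public void fixtureIsLoaded() throws Exception {
            Assert.assertEquals(3, tester.getConnection().createDataSet()
                    .getTable("customers").getRowCount());
        }
    }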

answered by Justin Bozonier


Here are some guidelines:

  1. Use an isolated database for unit testing (i.e. no other test runs or activity)
  2. Always insert all the test data you intend to query within the same test
  3. Write the tests to randomly create different volumes of data, e.g. a random number of inserts, say between 1 and 10 rows
  4. Randomize the data, e.g. for a boolean field, randomly insert true or false
  5. Keep counts of what you inserted in local test variables (e.g. number of rows, number of trues)
  6. For the asserts, execute the query and compare the results against those local variables (see the sketch after this list)
  7. Use Enterprise Services transactions to roll the database back to its previous state
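Here's a rough sketch of points 2 through 6 in Java with JDBC and JUnit; the customers table, its columns, and the connection string are stand-ins for illustration:

    import java.sql.*;
    import java.util.Random;
    import org.junit.Test;
    import static org.junit.Assert.*;

    public class RandomizedVolumeTest {

        @Test
        public void countOfActiveRowsMatchesWhatWeInserted() throws Exception {
            Random random = new Random();
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:mysql://localhost/testdb?user=test&password=test")) {
                conn.setAutoCommit(false); // rolled back at the end, in lieu of point 7

                // Point 3: a random volume of data, between 1 and 10 rows.
                int rows = 1 + random.nextInt(10);
                int trues = 0; // point 5: track counts in local test variables

                try (PreparedStatement insert = conn.prepareStatement(
                        "INSERT INTO customers (id, name, active) VALUES (?, ?, ?)")) {
                    for (int i = 0; i < rows; i++) {
                        boolean active = random.nextBoolean(); // point 4: randomized field
                        if (active) trues++;
                        insert.setInt(1, i);
                        insert.setString(2, "cust" + i);
                        insert.setBoolean(3, active);
                        insert.executeUpdate();
                    }
                }

                // Point 6: run the query under test and compare to the tracked counts.
                try (Statement stmt = conn.createStatement();
                     ResultSet rs = stmt.executeQuery(
                             "SELECT COUNT(*) FROM customers WHERE active = 1")) {
                    rs.next();
                    assertEquals(trues, rs.getInt(1));
                }

                conn.rollback(); // leave the database as we found it
            }
        }
    }

Doing the rollback inline works for a sketch; the Enterprise Services technique linked below wraps the whole test in a transaction for you instead.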

See the link below for the Enterprise Services transaction technique:

http://weblogs.asp.net/rosherove/articles/DbUnitTesting.aspx

answered by user32378


As part of our continuous integration, we run a nightly 'build' of the database queries. This involves a suite of DB calls that is updated regularly from the real calls in the code, as well as any expected ad-hoc queries.

These calls are timed to ensure that:

1/ They don't take too long.

2/ They don't differ wildly (in a bad way) from the previous night.

In this way, we catch errant queries or DB changes quickly.
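For illustration, a very rough sketch of that kind of timing check in Java; the queries, the 5-second ceiling, and the hard-coded baselines are placeholders (in practice the baselines would be persisted from the previous night's run):

    import java.sql.*;
    import java.util.*;

    public class NightlyQueryTimer {

        // Placeholder queries; in practice this list would be refreshed from the
        // real calls in the code plus any expected ad-hoc queries.
        private static final List<String> QUERIES = Arrays.asList(
                "SELECT COUNT(*) FROM customers",
                "SELECT * FROM orders WHERE placed_at > NOW() - INTERVAL 1 DAY");

        public static void main(String[] args) throws Exception {
            // Hypothetical previous-night baselines, keyed by query (in ms).
            Map<String, Long> lastNight = new HashMap<>();
            lastNight.put(QUERIES.get(0), 120L);
            lastNight.put(QUERIES.get(1), 450L);

            try (Connection conn = DriverManager.getConnection(
                    "jdbc:mysql://localhost/testdb?user=test&password=test");
                 Statement stmt = conn.createStatement()) {

                for (String sql : QUERIES) {
                    long start = System.nanoTime();
                    try (ResultSet rs = stmt.executeQuery(sql)) {
                        while (rs.next()) { /* drain the result set so timing is honest */ }
                    }
                    long elapsedMs = (System.nanoTime() - start) / 1_000_000;

                    // 1/ absolute ceiling; 2/ no wild regression versus last night.
                    if (elapsedMs > 5_000) {
                        System.err.println("TOO SLOW: " + sql + " took " + elapsedMs + " ms");
                    }
                    Long previous = lastNight.get(sql);
                    if (previous != null && elapsedMs > previous * 2) {
                        System.err.println("REGRESSION: " + sql + " went from "
                                + previous + " ms to " + elapsedMs + " ms");
                    }
                }
            }
        }
    }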

answered by paxdiablo