 

MSTest unit tests and database access without touching the actual database

In my code I interact with a database that is not part of my solution file. The database is owned by a separate team of DBAs, and the code we developers write is only allowed to access stored procs. However, we have full visibility of the database's definition: its procs, tables, and columns.

For code that depends on data, I currently write unit tests that insert dummy rows into the tables (and tear down/remove those rows after the unit test is done), so my tests can exercise the code that interacts with the DB. All of the code to do this lives in the test file, mostly in the ClassInitialize() and ClassCleanup() functions. However, my new coworkers have given me some amount of grief, calling my style of unit tests "destructive" because I read/write to the dev database, inserting and removing rows.

At the time we code the unit tests, the database design is generally not stable, so we can often find issues in the stored proc code before we unleash the QA department on our programs (which saves resources). My coworkers all tell me there's a way to clone the database into memory at the time the MSTest unit tests are run, but they don't know how to do it. I've researched around the web and cannot find a way to do what they're describing.

Can someone tell me for sure whether this is possible in the environment I've described above? If so, can you point me in the right direction?
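For context, the setup/teardown style described above looks roughly like this (a sketch only; the table, column, and connection-string names are hypothetical):

```csharp
using System.Data.SqlClient;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CustomerProcTests
{
    // Hypothetical connection string to the shared dev database.
    private const string DevDbConnectionString =
        "Server=devdb;Database=Sales;Integrated Security=true;";

    private static int _testCustomerId;

    [ClassInitialize]
    public static void ClassInitialize(TestContext context)
    {
        // Insert a known row so the stored procs have data to work with.
        using (var conn = new SqlConnection(DevDbConnectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO dbo.Customer (Name) OUTPUT INSERTED.CustomerId " +
            "VALUES ('UnitTestCustomer')", conn))
        {
            conn.Open();
            _testCustomerId = (int)cmd.ExecuteScalar();
        }
    }

    [ClassCleanup]
    public static void ClassCleanup()
    {
        // Remove the inserted row so the shared database is left unchanged.
        using (var conn = new SqlConnection(DevDbConnectionString))
        using (var cmd = new SqlCommand(
            "DELETE FROM dbo.Customer WHERE CustomerId = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", _testCustomerId);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```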

DMCS asked Nov 15 '10


2 Answers

Do you have SQL scripts that can be used to create your database? You should have, and they should be under version control. If so, then you can do the following:

In your test setup code:

  • create a 'temporary' database using the SQL scripts. Use a unique name, for example unitTestDatabase_[timestamp].
  • set up the data you require for your test in the test database, ideally using public API functions (e.g. CreateUser, AddNewCustomer); where the required API does not exist, use SQL commands. Using the API to set up test data makes the tests more robust against changes to the low-level implementation (i.e. the database schema), which is one reason we write unit tests: to ensure that changes to the implementation do not break functionality.
  • run your unit tests, using dependency injection to pass the test database connection string from the test code into the code under test.

and in your test teardown code, delete the database. Ideally this should be done using your database uninstall scripts, which should also be under version control.
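The dependency injection mentioned in the setup steps can be as simple as passing the connection string through a constructor, so the code under test points at the temporary database instead of a hard-coded server (class, proc, and parameter names here are hypothetical):

```csharp
using System.Data;
using System.Data.SqlClient;

// Production code takes its connection string from the caller instead of
// hard-coding it, so tests can substitute the temporary database.
public class CustomerRepository
{
    private readonly string _connectionString;

    public CustomerRepository(string connectionString)
    {
        _connectionString = connectionString;
    }

    // Hypothetical stored proc call; proc and parameter names are examples.
    public string GetCustomerName(int customerId)
    {
        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand("dbo.GetCustomerName", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@CustomerId", customerId);
            conn.Open();
            return (string)cmd.ExecuteScalar();
        }
    }
}
```

In the test, you construct the repository with the temporary database's connection string; in production, the real one.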

You can control how often you want to create a unit test database: e.g. per test project, test class or test method, or a combination, by creating the database in either an [AssemblyInitialize], [ClassInitialize] or [TestInitialize] method.
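The create/drop cycle above might be sketched with MSTest attributes like this, assuming a local SQL Server and version-controlled install scripts (how you run the scripts, e.g. via sqlcmd, is up to you):

```csharp
using System;
using System.Data.SqlClient;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class OrderProcTests
{
    private const string MasterConnectionString =
        "Server=(local);Integrated Security=true;";

    private static string _dbName;
    private static string _connectionString;

    [ClassInitialize]
    public static void CreateTestDatabase(TestContext context)
    {
        // Unique name per run, e.g. unitTestDatabase_20101115173000123.
        _dbName = "unitTestDatabase_" + DateTime.Now.ToString("yyyyMMddHHmmssfff");
        using (var conn = new SqlConnection(MasterConnectionString))
        {
            conn.Open();
            new SqlCommand("CREATE DATABASE [" + _dbName + "]", conn).ExecuteNonQuery();
        }
        _connectionString =
            "Server=(local);Database=" + _dbName + ";Integrated Security=true;";
        // Run the version-controlled install scripts against the new
        // database here, e.g. by shelling out to sqlcmd or executing the
        // script files batch by batch over _connectionString.
    }

    [ClassCleanup]
    public static void DropTestDatabase()
    {
        using (var conn = new SqlConnection(MasterConnectionString))
        {
            conn.Open();
            new SqlCommand("DROP DATABASE [" + _dbName + "]", conn).ExecuteNonQuery();
        }
    }
}
```

Moving the same pair of methods to `[AssemblyInitialize]`/`[AssemblyCleanup]` or `[TestInitialize]`/`[TestCleanup]` changes the database's lifetime to per-assembly or per-test.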

This is a technique we use with great success. The advantages are:

  • Every time we run the unit tests, we are testing that our database installation scripts work together with the code.
  • Test isolation: the tests only affect their own test database. Even if the rollback code goes wrong, you are not touching anyone else's data.
  • Confidence in the code. Because we are using a real database, the unit tests give me more confidence that the code works than if I were mocking the database. Of course, this depends on how good your suite of higher-level integration/component tests is.

Disadvantages:

  • The unit tests are dependent on an external system (the DBMS). Your test setup code will need to find the name of a DBMS instance to use; this can come from a config file or from checking at run time for a running local DBMS.
  • Tests may be slowed down by the database installation scripts. In our experience, the tests still run quickly enough, and there are plenty of opportunities to optimize. We run our suite of approximately 400 unit tests in about a minute, which includes creating five separate databases on a local installation of SQL Server 2008.
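The config-file option can be a plain connection-strings entry read at setup time (the entry name `TestDbServer` below is hypothetical):

```csharp
using System.Configuration;

// app.config (hypothetical):
// <connectionStrings>
//   <add name="TestDbServer"
//        connectionString="Server=(local);Integrated Security=true;" />
// </connectionStrings>
public static class TestDbConfig
{
    public static string MasterConnectionString
    {
        // Reads the server to create test databases on from app.config.
        get { return ConfigurationManager
                  .ConnectionStrings["TestDbServer"].ConnectionString; }
    }
}
```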
GarethOwen answered Sep 20 '22


If you can create a 'seam' between your business logic and your data access layer, you should be OK. Use interfaces to represent the contract your DAL exposes to the business logic, and then either write your own set of fake objects or use a mocking tool such as Rhino Mocks.

If you write tests that hit the database, you take on a large maintenance burden: as you state, the database is changing, and it also becomes difficult to maintain an environment that has access to the database. What you are actually writing are integration tests, which are still valid, but true unit tests shouldn't have any dependencies on databases, the file system, etc.
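A minimal sketch of that seam, with a hand-rolled fake standing in for the database (all names here are hypothetical):

```csharp
using System.Collections.Generic;

// The seam: business logic depends on this contract, not on the database.
public interface ICustomerDal
{
    string GetCustomerName(int customerId);
}

// Hand-rolled fake for unit tests; no database involved.
public class FakeCustomerDal : ICustomerDal
{
    private readonly Dictionary<int, string> _names =
        new Dictionary<int, string>();

    public void Add(int customerId, string name)
    {
        _names[customerId] = name;
    }

    public string GetCustomerName(int customerId)
    {
        return _names.ContainsKey(customerId) ? _names[customerId] : null;
    }
}

// Business logic under test receives the DAL through its constructor.
public class GreetingService
{
    private readonly ICustomerDal _dal;

    public GreetingService(ICustomerDal dal)
    {
        _dal = dal;
    }

    public string Greet(int customerId)
    {
        return "Hello, " + _dal.GetCustomerName(customerId);
    }
}
```

A test then seeds the fake and asserts on `GreetingService` alone; with Rhino Mocks you would generate a stub for `ICustomerDal` instead of writing the fake by hand.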

Ben Cawley answered Sep 18 '22