
How to (unit-)test a data-intensive PL/SQL application

Our team wants to unit-test the new code we are writing in a running project that extends an existing, huge Oracle system.

The system is written solely in PL/SQL and consists of thousands of tables and hundreds of stored procedure packages, mostly getting data from tables and/or inserting/updating other data.

Our extension is no exception. Most functions either return data from quite complex SELECT statements over many interrelated tables (with a little logic added before returning it), or transform one complicated data structure into another (complicated in a different way).

What is the best approach to unit-test such code?

There are no unit tests for the existing code base. To make things worse, only packages, triggers and views are source-controlled; table structures (including "alter table" scripts) and the necessary data transformations are deployed via a channel other than version control. There is no way to change this within our project's scope.

Maintaining a test data set seems impossible, since new code is deployed to the production environment on a weekly basis, usually without prior notice, and often changes the data structures (a column added here, one removed there).

I'd be glad of any suggestion or reference that could help us. Some team members are getting tired just figuring out how to start, because our experience with unit testing does not cover data-intensive PL/SQL legacy systems (only those "from-the-book" greenfield Java projects).

asked Apr 19 '10 by Karel Smutný

4 Answers

There are several different test tools for PL/SQL out there. Steven Feuerstein has written two of them: utplsql and Quest Code Tester for Oracle (formerly QUTE). I am a big fan of utplsql, but it no longer has an active support community (which is a shame). It also tends to be quite verbose, especially when it comes to setting up test fixtures. It does have the cardinal virtue of being pure PL/SQL packages; the source code is a bit gnarly but it is FOSS.
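
To give a feel for the shape of a utplsql test, here is a minimal sketch. The package under test (emp_api, with a get_bonus function) and the employees table are made up for the example; the conventions (a ut_-prefixed test package with ut_setup, ut_teardown and one ut_ procedure per test, run via utplsql.test) follow the classic utPLSQL v1/v2 style.

CREATE OR REPLACE PACKAGE ut_emp_api IS
  PROCEDURE ut_setup;
  PROCEDURE ut_teardown;
  PROCEDURE ut_get_bonus;
END ut_emp_api;
/

CREATE OR REPLACE PACKAGE BODY ut_emp_api IS
  PROCEDURE ut_setup IS
  BEGIN
    -- just enough data for this test, with a distinctive key
    INSERT INTO employees (emp_id, emp_name, salary)
    VALUES (-9001, 'ut_emp_api fixture', 1000);
    COMMIT;
  END ut_setup;

  PROCEDURE ut_teardown IS
  BEGIN
    DELETE FROM employees WHERE emp_id = -9001;
    COMMIT;
  END ut_teardown;

  PROCEDURE ut_get_bonus IS
  BEGIN
    utAssert.eq ('10% bonus on a salary of 1000',
                 emp_api.get_bonus (p_emp_id => -9001),
                 100);
  END ut_get_bonus;
END ut_emp_api;
/

-- and then, from SQL*Plus or similar:
-- exec utplsql.test ('emp_api')

The verbosity I mentioned lives almost entirely in ut_setup and ut_teardown; the assertions themselves stay small.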

QCTO comes with a GUI, which means that, like other Quest products such as TOAD, it is Windows only. It doesn't exactly automate test data generation, but it provides an interface to support it. Also like other Quest products, QCTO is licensed, although there is a freeware copy.

Steven (disclosure: he is one of my Oracle heroes) has written a feature comparison of all the PL/SQL testing tools. Obviously QCTO comes out tops, but I think the comparison is honest. Check it out.

Advice on test fixtures in utplsql

Managing test data for unit testing can be a real pain in the neck. Unfortunately utplsql doesn't offer much to shoulder the burden, so:

  • Always test against known values:
    • Avoid using dbms_random;
    • Try to restrict the use of sequences to columns whose values don't matter;
    • Dates are also tricky. Avoid hard-coding dates: use variables which are populated with sysdate. Learn to appreciate add_months(), last_day(), interval, trunc(sysdate, 'MM'), etc.
  • Isolate the test data from other users. Build it from scratch. Use distinctive values wherever possible.
  • Only create as much test data as you need. Volumetric testing is a different responsibility.
  • When testing procedures which change the data, create specific records for each unit test.
  • Also: don't rely on the successful output of one test to provide the input for another test.
  • When testing procedures which simply report against data, share records between unit tests when appropriate.
  • Share framework data (e.g. referenced primary keys) whenever possible.
  • Use free text fields (names, descriptions, comments) to identify which test or tests use the record.
  • Minimise the work involved in creating new records:
    • Only assign values which are necessary to the test suite and the table's constraints;
    • Use default values as much as possible;
    • Proceduralize as much as possible (see the sketch after this list).
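
To illustrate that last point, here is a rough sketch of a "proceduralized" test-data helper; the orders table, its columns and the ins_order procedure are invented for the example. It assigns only the columns the tests and the NOT NULL constraints need, leans on table defaults for everything else, avoids hard-coded dates by deriving them from sysdate, and stamps each row with a free-text marker identifying the test package.

PROCEDURE ins_order (
  p_order_id   IN orders.order_id%TYPE,
  p_client_id  IN orders.client_id%TYPE,
  p_order_date IN orders.order_date%TYPE DEFAULT trunc(sysdate, 'MM'),
  p_comments   IN orders.comments%TYPE   DEFAULT 'ut_order_pkg fixture'
) IS
BEGIN
  -- everything not listed here falls back to the table defaults
  INSERT INTO orders (order_id, client_id, order_date, comments)
  VALUES (p_order_id, p_client_id, p_order_date, p_comments);
END ins_order;

ut_setup then reduces to a handful of calls with distinctive keys, e.g. ins_order(-101, -1); ins_order(-102, -1, add_months(trunc(sysdate, 'MM'), -1)); which keeps each test's data intentions readable.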

Other things to bear in mind:

  • setting up a test fixture can be a time-consuming exercise. If you have a lot of data, consider building a procedure to set up the static data which can be run once per session, and include only the volatile data in ut_setup itself. This is especially helpful when testing read-only functionality.
  • remember that creating test data is a programming exercise in its own right, and so prone to bugs.
  • use all the features of utplsql. utAssert.EqQuery, utAssert.EqQueryValue, utAssert.EqTable, utAssert.EqTabCount and utAssert.Eq_RefC_Query are all very useful features when it comes to inferring the values of volatile data.
  • when diagnosing a test run which didn't go the way we were expecting, it can be useful to have the data which was used. So consider having a hollow ut_teardown procedure and clearing down the test data at the start of ut_setup (a sketch of this pattern follows this list).
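
Here is a sketch of those last two points, reusing the hypothetical orders table and ins_order helper from above; the utAssert.eqqueryvalue call assumes the utPLSQL v1/v2-style assertion package.

PROCEDURE ut_setup IS
BEGIN
  -- clear down here rather than in ut_teardown, so the data is still
  -- available for inspection after a failed run
  DELETE FROM orders WHERE comments = 'ut_order_pkg fixture';
  ins_order (-101, -1);
  ins_order (-102, -1);
  COMMIT;
END ut_setup;

PROCEDURE ut_teardown IS
BEGIN
  NULL;  -- deliberately hollow; see ut_setup
END ut_teardown;

PROCEDURE ut_count_orders IS
BEGIN
  utAssert.eqqueryvalue (
    'two fixture orders for client -1',
    'SELECT count(*) FROM orders WHERE client_id = -1',
    2);
END ut_count_orders;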

Dealing with legacy code

Commenting on Gary's post reminded me of one other thing you may find useful. Steven F wrote utplsql as a PL/SQL implementation of JUnit, the Java vanguard of the Test First movement. However, the techniques of TDD can also be applied to large amounts of legacy code (in this context, legacy code is any set of programs without any unit tests).

The key thing to bear in mind is that you don't have to get everything under unit test immediately. Start incrementally. Build unit tests for new stuff, Test First. Build unit tests for the bits you're going to change before you apply the change, so you know they still work after you have made the change.

There is a lot of thought in this area, but (inevitably if shamefully) it mainly comes from the OO programmers. Michael Feathers is the main chap. Read his article Working Effectively With Legacy Code; if you find it helpful, he subsequently wrote a book of the same name.

answered Nov 16 '22 by APC


Take the following scenario:

FUNCTION ret_count (local_client_id IN number) RETURN NUMBER IS
  v_ret number;
BEGIN
  SELECT count(*) INTO v_ret FROM table_1 WHERE client_id = local_client_id;
  RETURN v_ret;
END;

A very simple function, but there is a whole mess of stuff that can go wrong. Datatype conversions, indexing and stats could all affect query paths and performance and, in some cases, cause errors. There's also a lot of loose coupling, such as session settings (e.g. linguistic preferences). If someone came along and added a column "LOCAL_CLIENT_ID" to table_1, the whole logic of the function would change.
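
To make that last point concrete: inside a SQL statement embedded in PL/SQL, Oracle resolves an unqualified name to a column of the queried table in preference to a PL/SQL parameter of the same name, so a later (hypothetical) schema change silently changes what the function computes:

-- hypothetical later change, deployed outside version control:
ALTER TABLE table_1 ADD (local_client_id NUMBER);

-- from now on, inside ret_count the predicate
--   WHERE client_id = local_client_id
-- compares two columns of table_1 instead of comparing the column with
-- the parameter, so the function quietly returns a different count
-- without ever failing to compile.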

I personally don't feel that TDD is suitable for this environment. Code reviews by knowledgeable individuals will have a greater chance of catching problems.

If you've got money, then look at Oracle's RAT (Real Application Testing) for regression testing.

answered Nov 16 '22 by Gary Myers


I've used DbFit to unit-test PL/SQL code. Give it a try.

answered Nov 16 '22 by Oliver Michels


utPLSQL might help, but it sounds like you need a better way of maintaining test data. You might also want to look at Swingbench for that.

answered Nov 16 '22 by dpbradley