 

Continuous Integration: keeping the test DB schema up-to-date

I'm setting up a continuous integration server (Hudson) to build a Java project and run the relevant unit/integration tests. Most of these tests access a database and the test data is kept in a DbUnit XML file.

I'm looking for a way to automatically keep the test database schema up-to-date. Currently the SQL script for a particular release is stored in a directory named after the release version:

└───scripts
    ├───0.1.0
    ├───0.1.1
    ├───0.1.2
    ├───0.1.4

For example, the SQL script for version 0.1.4 is

scripts\0.1.4\script-0.1.4.sql

The problem is that these scripts contain a mixture of schema changes (e.g. ALTER TABLE...) and changes to the static tables (e.g. add a new role to the USER_TYPE table).

In the case of the unit tests I only want to apply the schema changes, because as mentioned above, all the data for the unit tests is kept in a DbUnit XML file. Although I could separate these two types of database changes into different files, there will often be a dependency between the schema changes and the data changes that will need to be somehow enforced when the release is being applied to QA, production, etc.

Anyway, this is just a very long-winded way of asking whether anyone has come up with a robust way to automatically keep their test schema up-to-date? I know Unitils has some support for keeping a test schema up-to-date, but I'm not sure if it can 'ignore' data update statements in the SQL delta scripts.
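For reference, the test data is a DbUnit flat XML data set, along these lines (the table and column names here are just illustrative, not my real schema):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<dataset>
  <!-- one element per row: element name = table, attributes = columns -->
  <USER_TYPE ID="1" NAME="ADMIN" />
  <USER_TYPE ID="2" NAME="CUSTOMER" />
  <USERS ID="1" EMAIL="test@example.com" USER_TYPE_ID="1" />
</dataset>
```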

Dónal asked Jan 20 '09


3 Answers

A previous poster listed Liquibase as an option, but didn't mention Liquibase's ability to mark change sets to run only in particular contexts. You can leave the schema updates unmarked by any context and tag the unit-test fixtures with a context of test. That way, the fixtures are only inserted when you run your unit tests.

Here is an example of a Liquibase change set that contains the schema and the fixtures:

<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog/1.9"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog/1.9
                        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-1.9.xsd">
  <changeSet author="avalade" id="1">
    <createTable tableName="users">
      <column autoIncrement="true" name="id" type="long">
        <constraints nullable="false" primaryKey="true" />
      </column>
      <column name="email" type="varchar(255)" />
    </createTable>
  </changeSet>
  <changeSet author="avalade" id="2" context="test">
    <insert tableName="users">
      <column name="id" value="1" />
      <column name="email" value="[email protected]" />
    </insert>
  </changeSet>
</databaseChangeLog>

Then, if you're using Spring to manage your DAOs, you can put the following in the application context file that you deploy:

<bean id="liquibase" class="liquibase.spring.SpringLiquibase">
  <property name="dataSource" ref="dataSource" />
  <property name="changeLog" value="classpath:dbChangelog.xml" />
</bean>

For the Application Context file that you use in your unit tests, configure Liquibase with an additional context property:

<bean id="liquibase" class="liquibase.spring.SpringLiquibase">
  <property name="dataSource" ref="dataSource" />
  <property name="changeLog" value="classpath:dbChangelog.xml" />
  <property name="contexts" value="test" />
</bean>

This way, you can keep all of your database definitions together in one place and only insert your fixtures when you're running your test code.

Aaron answered Oct 21 '22


What we have found most manageable for the gradual evolution of live/test DB schemas is a schema migration tool like Liquibase.

This lets us apply the latest schema changes to whatever environment we choose, test or otherwise, in a consistent fashion, which in turn lets us run whatever automation we wish against the up-to-date schema.
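For example, bringing a given environment up to date is a single command with the Liquibase CLI (the driver, URL, and credentials below are placeholders for your own settings):

```
liquibase --driver=com.mysql.jdbc.Driver \
          --classpath=mysql-connector.jar \
          --changeLogFile=dbChangelog.xml \
          --url="jdbc:mysql://localhost/testdb" \
          --username=dbuser \
          --password=dbpass \
          --contexts=test \
          update
```

Dropping `--contexts=test` applies only the context-free change sets (i.e. the schema changes), which addresses the original question of skipping the test fixtures.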

j pimmel answered Oct 21 '22


What I do in my tests:

  • I keep a DB version somewhere
  • In the first test, I tear down the whole DB and build it from scratch
  • I run each schema update in an individual test
  • I run the "update DB" module as an individual test (it must not do anything, since all changes have already been applied). Optionally, I tear down the DB again and run it once more.
  • I load the test data into the DB (some of the tests above will do this if they fix data errors).
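The "check the DB version, apply what's missing" step can be sketched in plain Java (the class and method names are mine, not from any library). The one detail worth getting right is numeric, segment-by-segment version comparison, so that a hypothetical 0.1.10 sorts after 0.1.4 where plain string ordering would not:

```java
import java.util.*;

public class SchemaUpdater {

    // Compare dotted version strings numerically, segment by segment,
    // so "0.1.10" sorts after "0.1.4" (string comparison would not).
    static int compareVersions(String a, String b) {
        String[] as = a.split("\\."), bs = b.split("\\.");
        int n = Math.max(as.length, bs.length);
        for (int i = 0; i < n; i++) {
            int ai = i < as.length ? Integer.parseInt(as[i]) : 0;
            int bi = i < bs.length ? Integer.parseInt(bs[i]) : 0;
            if (ai != bi) return Integer.compare(ai, bi);
        }
        return 0;
    }

    // Given the version recorded in the DB and the available script
    // versions, return the ones still to be applied, in apply order.
    static List<String> pendingScripts(String dbVersion, Collection<String> available) {
        List<String> pending = new ArrayList<>();
        for (String v : available) {
            if (compareVersions(v, dbVersion) > 0) pending.add(v);
        }
        pending.sort(SchemaUpdater::compareVersions);
        return pending;
    }

    public static void main(String[] args) {
        List<String> scripts = Arrays.asList("0.1.0", "0.1.1", "0.1.2", "0.1.4", "0.1.10");
        System.out.println(pendingScripts("0.1.1", scripts)); // [0.1.2, 0.1.4, 0.1.10]
    }
}
```

Each pending version then maps back to its script (e.g. scripts/0.1.4/script-0.1.4.sql), which gets executed against the test DB in order.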

Now, the test DB is ready for the "real" (application) tests. After each of the application tests, I roll back the current transaction so the test DB never changes after setup.

To make testing faster, I usually have three test suites: one that contains only the DB setup, one that contains only the application tests, and one that contains the other two. This lets me quickly reset the test DB and then run a single test from the app suite.

Aaron Digulla answered Oct 21 '22