
How to prepare for integration tests which use an in-memory replacement for PostgreSQL?

I have learned that using an actual database in integration tests slows them down significantly. So I would like to use an in-memory database, which should significantly speed up my integration tests.

I'm using Spring Boot for application development. How do I configure PostgreSQL for testing purposes? Is there an in-memory database that is highly compatible with PostgreSQL's syntax?

If there is none, how should I perform integration tests?

Tarun Maganti asked Apr 19 '17


3 Answers

You can actually get a real Postgres to perform quite well in a testing environment.

I would also suggest you use a dockerized database, but mount the data directory on tmpfs so it lives in RAM:

docker run --name postgres95 -p 5432:5432 --tmpfs /var/lib/postgresql/data:rw -e POSTGRES_PASSWORD=admin -d postgres:9.5.6

This is as close to "in-memory" as you can get using the real thing.
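A Spring Boot test profile can then point at that container. A minimal sketch; the database name, user, and password below are assumptions matching the defaults of the `docker run` command above:

```properties
# src/test/resources/application-test.properties
# Assumes the dockerized Postgres above is listening on localhost:5432
spring.datasource.url=jdbc:postgresql://localhost:5432/postgres
spring.datasource.username=postgres
spring.datasource.password=admin
spring.datasource.driver-class-name=org.postgresql.Driver
```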

I believe that one of the main problems with slow integration tests is not the performance of the database itself, but the time it takes to set it up for each test.

I wrote a little library to help you quickly restore a database to a 'clean' state. This way you only need to run costly database migrations once, and then you can quickly restore the database for each test.

We used it in a production system to get a 4x speedup in our integration tests:

https://github.com/ayedo/postgres-db-restore
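One common way to implement such a restore (not necessarily what the library above does internally) is with Postgres template databases: run the costly migrations once into a template, then recreate the test database from it before each test. The database names here are hypothetical:

```sql
-- One-time setup: run all migrations into app_template.
-- Before each test, rebuild the test database from the template
-- (CREATE DATABASE ... TEMPLATE requires no active connections to the template):
DROP DATABASE IF EXISTS app_test;
CREATE DATABASE app_test TEMPLATE app_template;
```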

Ynv answered Nov 09 '22


Some of my database tests against a real Postgres take about 10 ms each, and each test performs multiple commits. So:

To have coverage of Postgres-native features you need the same database (as you noticed, H2 and other in-memory databases are not very compatible). Postgres doesn't have an in-memory mode. For functional tests, the real database itself is not much slower than any in-memory database. The difference usually lies in startup time (for Postgres 9.6 it's ~4 s). But if your testing lifecycle is smart and you can lower the number of database starts to 1 or 0 (by keeping a development database always ready), the problem stops being noticeable.

So get the real Postgres and set up its lifecycle correctly. There are some tools that can help you solve these problems:

  1. testcontainers - will help you provide a real database.

  2. dbunit - will help you clean the data between tests

    cons:

    • a lot of work is required to create and maintain the schema and data, especially when your project is in an intensive development stage
    • it's another abstraction layer, so if you suddenly want to use some db feature that is unsupported by this tool, it may be difficult to test it
  3. testegration - intends to provide you a full, ready-to-use and extensible lifecycle (disclosure: i'm the creator).

    cons:

    • free only for small projects
    • very young project
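For option 1, Testcontainers can be wired in without any lifecycle code of your own via its special JDBC URL scheme: a throwaway Postgres container is started on first connection and torn down afterwards. A minimal Spring Boot sketch, assuming the Testcontainers JDBC module is on the test classpath:

```properties
# src/test/resources/application-test.properties
# jdbc:tc: URLs are intercepted by the Testcontainers driver,
# which starts a disposable postgres:9.6 container on demand
spring.datasource.url=jdbc:tc:postgresql:9.6:///testdb
spring.datasource.driver-class-name=org.testcontainers.jdbc.ContainerDatabaseDriver
```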

Another step would be to move the database into memory at the OS level. Again, the first startup would take a similar amount of time since everything needs to be loaded. Some starting points here and here.

cons:

  • every dev on your team has to modify their local environment
  • not portable between OSes (if your team has heterogeneous environments)
piotrek answered Nov 10 '22


This question asks for opinions, but here goes:

If you want to test an application that will use PostgreSQL, you will have to use PostgreSQL for your tests. SQL dialects and behaviour just vary too much between different database management systems.

You can make PostgreSQL quite fast if you use a database that is small enough to fit into RAM, which should be possible for integration tests that just target the functionality, not the overall performance.
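If the instance is dedicated to tests, you can also trade durability for speed, since test data is disposable. These are standard `postgresql.conf` settings; a sketch for a throwaway test instance only, never for real data:

```ini
# postgresql.conf for a disposable test instance -- unsafe for real data
fsync = off                # skip forcing WAL writes to disk
synchronous_commit = off   # don't wait for WAL flush on commit
full_page_writes = off     # skip full-page images after checkpoints
```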

Laurenz Albe answered Nov 10 '22