I do a lot of unit testing (mainly PHP/MySQL), but the SQL scripts used to create the database are far too slow to run, so I waste a lot of time. I cannot always mock the database (legacy code, too complex to handle), so what can I do? Copy the MySQL data files directly? Load my DB some other way?
Loading lots of data is quick; it's only the CREATE TABLE statements which are slow.
Reading another question gave me an answer: just move my test database into memory. Under Debian/Ubuntu (my case), I only had to move the directory of my test DB to /dev/shm, create a symlink so the old path points to the new location, restart the MySQL server, and tada!
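The steps above can be sketched as shell commands. The real move targets MySQL's data directory (typically /var/lib/mysql on Debian/Ubuntu, with the server stopped first); the database name `test_db` is an assumption. The runnable part below uses throwaway paths so it only demonstrates the move-and-symlink mechanics:

```shell
#!/bin/sh
set -e
# Real sequence (requires root, server stopped; paths/DB name assumed):
#   sudo service mysql stop
#   sudo mv /var/lib/mysql/test_db /dev/shm/test_db
#   sudo ln -s /dev/shm/test_db /var/lib/mysql/test_db
#   sudo service mysql start
# Standalone demonstration with throwaway paths:
DATADIR=$(mktemp -d)              # stands in for /var/lib/mysql
SHMDIR=$(mktemp -d -p /dev/shm)   # RAM-backed tmpfs directory
mkdir "$DATADIR/test_db"
echo "dummy" > "$DATADIR/test_db/ibdata1"
mv "$DATADIR/test_db" "$SHMDIR/test_db"
ln -s "$SHMDIR/test_db" "$DATADIR/test_db"
# Same path as before, but the files now live in memory:
cat "$DATADIR/test_db/ibdata1"    # prints: dummy
```

Keep in mind that tmpfs contents vanish on reboot, so you still need a way to recreate the test database from scratch.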
A suite which took 140s to run now runs in 10s. So obviously worth doing! That project uses the database heavily during tests. Another one, which runs in 18-20s with the database on disk, does not run faster with the database in memory, but it has more unit tests than the first project and fewer tables to create in integration testing.
You can use a pre-initialized (test) database. This way you only have to restore it before the test run instead of rebuilding the schema from scratch.
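In practice this means building the schema once, dumping it, and restoring the dump before each run instead of re-executing every CREATE TABLE. The mysqldump/mysql invocations below are the usual client commands; the database name, user, and file names are assumptions, and the runnable part only simulates the restore with plain files:

```shell
#!/bin/sh
set -e
# One-time setup (real commands, credentials/names assumed):
#   mysql -u test_user -p test_db < create_tables.sql
#   mysqldump -u test_user -p test_db > snapshot.sql
# Before each test run, restore from the snapshot:
#   mysql -u test_user -p test_db < snapshot.sql
# Self-contained stand-in: a plain file plays the role of the snapshot.
WORK=$(mktemp -d)
printf 'CREATE TABLE users (id INT);\n' > "$WORK/snapshot.sql"
cp "$WORK/snapshot.sql" "$WORK/restored.sql"   # the "restore" step
cat "$WORK/restored.sql"                       # shows the restored DDL
```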
When you do this, you will need to maintain a database migration script that lets you update the test database when the schema changes, but you are probably already doing this with your current approach, and you would need it anyway when rolling out a new version of your software.
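A minimal migration-runner sketch: apply numbered .sql files in lexical order so the test database can be brought up to date after a schema change. The file names and the commented-out mysql call are assumptions; the runnable part only records the order of application:

```shell
#!/bin/sh
set -e
MIGDIR=$(mktemp -d)
# Hypothetical migrations, numbered so the glob sorts them correctly:
printf 'CREATE TABLE users (id INT);\n'            > "$MIGDIR/001_create_users.sql"
printf 'ALTER TABLE users ADD name VARCHAR(64);\n' > "$MIGDIR/002_add_name.sql"
APPLIED=""
for f in "$MIGDIR"/*.sql; do                 # glob expands in sorted order
    # mysql -u test_user -p test_db < "$f"  # real invocation (assumed creds)
    APPLIED="$APPLIED$(basename "$f") "
done
echo "$APPLIED"    # prints: 001_create_users.sql 002_add_name.sql
```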
Doing things this way will still be quite slow, but considerably faster than running all the DDL scripts for each test. If possible, write unit tests instead of integration tests, but as you already said, that is hard given the state of the system (legacy).