I would like to have a CI build (e.g., Hudson) set up and tear down an Oracle 11g schema as part of a nightly build/test cycle for a fairly vanilla JSF/JPA application.
The most obvious way to do this is by dropping and re-creating all tables. While this feels fairly standard (at least, that's what the Hibernate/JPA tools would do automatically for you), I've had Oracle DBAs warn me that the Oracle catalog will get fragmented after repeated object create/drop cycles. Eventually this will cause performance problems because the SYSTEM tablespace cannot be defragmented/coalesced.
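For reference, this drop-and-create behaviour can be switched on declaratively; a sketch of a persistence-unit configured this way (the property shown is the standard JPA 2.1 one; with plain Hibernate, `hibernate.hbm2ddl.auto=create-drop` behaves similarly — the unit name here is made up):

```xml
<!-- Hypothetical persistence.xml fragment for a CI test run:
     drop and re-create the schema from the JPA mappings on startup. -->
<persistence-unit name="ci-tests">
  <properties>
    <property name="javax.persistence.schema-generation.database.action"
              value="drop-and-create"/>
  </properties>
</persistence-unit>
```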
My questions are: is this fragmentation warning a real concern on a modern Oracle release, and if so, is there a better way to reset the schema between nightly runs?

Thanks!
Don't believe those DBAs
At least with 10g and above, when using locally managed tablespaces (LMT), this should not be a problem.

And even if it did cause some fragmentation, I very much doubt you could measure its impact, especially on a database used for CI.
I am in the process of putting a CI build process in place for my second Oracle project. I don't think dropping and recreating everything will do any harm (as a_horse_with_no_name stated above). I am glad to hear you are thinking of extending CI to the database objects; too many teams don't.
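If you script the teardown yourself rather than relying on the JPA tooling, a sketch of building the DROP statements (class and method names are made up; the object list would come from querying USER_OBJECTS). `CASCADE CONSTRAINTS` removes foreign keys referencing the table, and `PURGE` bypasses the recycle bin so nightly drops don't accumulate BIN$ objects:

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical CI teardown helper: turn (object_type, object_name) pairs
// queried from USER_OBJECTS into DROP statements to execute over JDBC.
class SchemaTeardown {

    static String dropStatement(String objectType, String objectName) {
        if ("TABLE".equals(objectType)) {
            // PURGE skips the recycle bin; CASCADE CONSTRAINTS drops FKs
            // from other tables that reference this one.
            return "DROP TABLE " + objectName + " CASCADE CONSTRAINTS PURGE";
        }
        return "DROP " + objectType + " " + objectName;
    }

    static List<String> dropStatements(List<String[]> objects) {
        return objects.stream()
                .map(o -> dropStatement(o[0], o[1]))
                .collect(Collectors.toList());
    }
}
```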
A different approach would be to restore the database from a recent backup each night (or use Flashback Database) and migrate your application schema from 'production backup' to the current dev state on each CI run. That way, the code that will eventually be applied to production is tested each night against something largely identical to production. It is a bit of a change in thinking, but not too much of one if you are already thinking about CI.
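The key to the migrate-forward approach is applying the pending change scripts in a deterministic order after each restore. A minimal sketch, assuming scripts are named with a numeric version prefix like `001_create_orders.sql` (the naming convention and class name are assumptions, not dbgeni's actual scheme):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Hypothetical sketch: sort migration scripts by their numeric prefix so
// each CI run replays them against the restored backup in the same order.
class MigrationOrder {

    // Extract the version number from a name like "010_add_index.sql".
    static int version(String fileName) {
        return Integer.parseInt(fileName.substring(0, fileName.indexOf('_')));
    }

    static List<String> ordered(List<String> scripts) {
        List<String> sorted = new ArrayList<>(scripts);
        sorted.sort(Comparator.comparingInt(MigrationOrder::version));
        return sorted;
    }
}
```

Each script would then be executed in turn over JDBC, recording the applied versions in a table so already-applied scripts are skipped.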
If you fancy trying the migration approach, I have a tool I have been working on that may help: http://dbgeni.com It is still very much under development, but I designed it with CI and managing database changes via migrations in mind.