 

Strategies for coping with schema evolution? [closed]

Currently we're using hand-rolled SQL in data-access objects and a lot of stored procedures and triggers, amounting to around 20k lines of code. We're finding that simple changes are causing a couple of days' work to fix, and it's causing deadlines to slip.

Changes include modifications to tables to cope with additional data, general refactoring of the schema based on QA/user reports, etc. It's a very active system that's being built to replace something old and slow.

We looked at the PHP ORM solutions available to try to limit the effects of these changes, but they were just too slow to cope with our schema; "simple" SQL results were taking orders of magnitude longer to return than our custom queries, and caused page views of ~0.5s to take over 20s.

What best-practices/strategies could I look into to cope with schema evolution with relational databases, in a general context?

Edit: I forgot to mention the triggers; we have a lot of data which relies on cascading changes, e.g. a price change here for this user updates a price there for that user, etc.
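To make the cascading-price pattern concrete, here is a minimal sketch using SQLite via Python's sqlite3 module (SQLite stands in for the real database, and all table, column, and trigger names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE base_prices (product_id INTEGER PRIMARY KEY, price REAL);
CREATE TABLE user_prices (
    user_id INTEGER, product_id INTEGER, price REAL, discount REAL
);
-- When a base price changes, cascade it to every per-user price,
-- preserving each user's discount.
CREATE TRIGGER cascade_price AFTER UPDATE OF price ON base_prices
BEGIN
    UPDATE user_prices
    SET price = NEW.price * (1 - discount)
    WHERE product_id = NEW.product_id;
END;
""")
conn.execute("INSERT INTO base_prices VALUES (1, 100.0)")
conn.execute("INSERT INTO user_prices VALUES (42, 1, 90.0, 0.10)")

# A single base-price change fans out through the trigger.
conn.execute("UPDATE base_prices SET price = 200.0 WHERE product_id = 1")
print(conn.execute(
    "SELECT price FROM user_prices WHERE user_id = 42").fetchone()[0])
# → 180.0
```

The pain point in the question follows directly: any schema change touching `base_prices` or `user_prices` forces a review of every trigger that references them.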

asked Nov 20 '08 by Phillip B Oldham

2 Answers

You might want to check out the book Refactoring Databases: Evolutionary Database Design.

answered Sep 18 '22 by Jeffrey Fredrick


I suggest using a continuous (or at least nightly) build strategy.
Rebuild the database on every checkin, or at least once per day.
Also once per day, run unit tests that exercise each bit of code, whether it lives in a stored procedure, a trigger, or the data-access layer.

There is a great cost to writing stored procs, but this approach will identify breaks immediately.
Once you know where the break is, you can fix it.
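The rebuild step above could be sketched roughly like this, assuming migrations are kept as numbered `.sql` files replayed in order (the file names and layout here are invented, and SQLite stands in for the real database):

```python
import pathlib
import sqlite3
import tempfile

def rebuild(db_path, migrations_dir):
    """Drop the database and rebuild it by replaying every
    migration script in filename order."""
    pathlib.Path(db_path).unlink(missing_ok=True)  # start from scratch
    conn = sqlite3.connect(db_path)
    for script in sorted(pathlib.Path(migrations_dir).glob("*.sql")):
        conn.executescript(script.read_text())
    conn.commit()
    return conn

# Demo with two throwaway migration scripts.
migrations = pathlib.Path(tempfile.mkdtemp())
(migrations / "001_create_users.sql").write_text(
    "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);")
(migrations / "002_add_email.sql").write_text(
    "ALTER TABLE users ADD COLUMN email TEXT;")

conn = rebuild(":memory:", migrations)
columns = [row[1] for row in conn.execute("PRAGMA table_info(users)")]
print(columns)  # ['id', 'name', 'email']
```

A CI job would run something like this against a scratch database on every checkin and then run the test suite, so a migration that breaks a stored proc or trigger fails the build the same day it's written.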

I'd be interested to hear other people's experiences with this strategy applied to database changes.

answered Sep 22 '22 by AJ.