How should I organize my master DDL script?

I am currently creating a master DDL script for our database. Historically we have used backup/restore to version our database and have not maintained any DDL scripts. The schema is quite large.

My current thinking:

  • Break the script into parts (possibly as separate scripts):

    1. table creation
    2. add indexes
    3. add triggers
    4. add constraints
  • Each script would get called by the master script.

  • I might need a script to drop constraints temporarily for testing (see the sketch after this list).
  • There may be orphaned tables in the schema; I plan to identify the suspect tables.
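
For the temporary-constraint step, here is a minimal sketch of what I have in mind; the table name is hypothetical, and on SQL 2000 NOCHECK disables foreign key and CHECK constraints rather than dropping them:

    -- Disable foreign key and CHECK constraints on one table for testing
    -- (hypothetical table; repeat per table, or generate these statements
    -- from INFORMATION_SCHEMA.TABLE_CONSTRAINTS)
    ALTER TABLE dbo.OrderDetail NOCHECK CONSTRAINT ALL

    -- ... load test data / run tests ...

    -- Re-enable the constraints afterwards (note: this alone does not
    -- re-validate rows that were changed while the constraints were off)
    ALTER TABLE dbo.OrderDetail CHECK CONSTRAINT ALL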

Any other advice?

Edit: Also, if anyone knows of good tools to automate part of the process, that would be helpful; we're using MS SQL 2000 (old, I know).

asked Aug 07 '08 by Justin Standard


4 Answers

I think the basic idea is good.

The nice thing about building all the tables first and then adding all the constraints is that the tables can be created in any order. When I've done this, I had one file per table, which I put in a directory called "Tables", and then a script which executed all the files in that directory. Likewise, I had a folder for constraint scripts (which handled foreign keys and indexes too), which were executed after the tables were built.
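
A rough sketch of how that split keeps the table scripts order-independent; the table, file, and constraint names below are made up, and the foreign key lives in the separate constraints folder:

    -- Tables\Customer.sql: no references to any other table
    CREATE TABLE dbo.Customer (
        CustomerId int          NOT NULL,
        Name       varchar(100) NOT NULL
    )

    -- Tables\SalesOrder.sql: also free of references
    CREATE TABLE dbo.SalesOrder (
        SalesOrderId int NOT NULL,
        CustomerId   int NOT NULL
    )

    -- Constraints\Customer.sql and Constraints\SalesOrder.sql:
    -- run only after every table exists
    ALTER TABLE dbo.Customer
        ADD CONSTRAINT PK_Customer PRIMARY KEY (CustomerId)
    ALTER TABLE dbo.SalesOrder
        ADD CONSTRAINT PK_SalesOrder PRIMARY KEY (SalesOrderId)
    ALTER TABLE dbo.SalesOrder
        ADD CONSTRAINT FK_SalesOrder_Customer
            FOREIGN KEY (CustomerId) REFERENCES dbo.Customer (CustomerId)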

I would separate out the build of the triggers and stored procedures and run these last. The point about these is that they can be run and re-run against the database without affecting the data, which means you can treat them just like ordinary code. You should include "if exists...drop" statements at the beginning of each trigger and procedure script to make them re-runnable.
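
On SQL 2000 that re-runnable header might look something like this; the procedure and table names are placeholders, and the same pattern works for triggers by checking OBJECTPROPERTY(id, 'IsTrigger') instead:

    -- Make the script re-runnable: drop the procedure if it already exists
    IF EXISTS (SELECT * FROM dbo.sysobjects
               WHERE id = OBJECT_ID(N'dbo.usp_GetCustomer')
                 AND OBJECTPROPERTY(id, N'IsProcedure') = 1)
        DROP PROCEDURE dbo.usp_GetCustomer
    GO

    CREATE PROCEDURE dbo.usp_GetCustomer
        @CustomerId int
    AS
        SELECT * FROM dbo.Customer WHERE CustomerId = @CustomerId
    GO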

So the order would be:

  1. table creation
  2. add indexes
  3. add constraints

Then

  1. add triggers
  2. add stored procedures

On my current project we are using MSBuild to run the scripts; there are extension targets available for it which let you call SQL scripts. In the past I have used Perl, which was fine too (and batch files, which I would not recommend; they're too limited).

answered by Mat Roberts


@Adam

Or how about just by domain -- a useful grouping of related tables in the same file, but separate from the rest?

The only problem is that some domains (in this somewhat legacy system) are tightly coupled. Plus, you have to maintain the dependencies between your different sub-scripts.

answered by Justin Standard


If you are looking for an automation tool, I have often worked with EMS SQLManager, which lets you automatically generate a DDL script from a database.

Data inserts into reference tables might be mandatory before putting your database online; these can even be considered part of the DDL script. EMS can also generate scripts for those data inserts from existing databases.
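
A minimal, re-runnable sketch of such a reference-data insert; the table and values are only illustrative:

    -- Seed a lookup row only if it is not already there
    IF NOT EXISTS (SELECT * FROM dbo.OrderStatus WHERE StatusCode = 'OPEN')
        INSERT INTO dbo.OrderStatus (StatusCode, Description)
        VALUES ('OPEN', 'Order received, not yet shipped')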

The need for indexes might not be properly estimated at the DDL stage. At that point you only need to declare them for primary/foreign keys; other indexes should be created later, once views and queries have been defined.

answered by Philippe Grondier


What you have there seems pretty good. My company has, on occasion, for large enough databases, broken it down even further, perhaps to the individual object level, so that each table/index/... has its own file. It can be useful or it can be overkill; it really depends on how you are using it.

@Justin

By domain is almost always sufficient. I agree that there are some complexities to deal with when doing it this way, but they should be easy enough to handle.

I think this method provides a little more separation (which, in a large database, you will come to appreciate) while still being pretty manageable. We also write Perl scripts that do a lot of the processing of these DDL files, so that might be a good way to handle that part as well.

answered by Adam Lerman