(Sorry for the long post, but I guess all the information is really necessary)
We have two tables - task and subtask. Each task consists of one or more subtasks, and each of these objects has a start date, an end date and a duration. Additionally, subtasks have an ordering.
Tables
create table task (
pk number not null primary key,
name varchar2(30) not null,
start_date date,
duration_in_days number,
end_date date,
needs_recomputation number default 0
);
create table subtask (
pk number not null primary key,
task_fk number references task(pk),
name varchar2(30) not null,
start_date date,
duration_in_days number,
end_date date,
ordering number not null
);
Business rules
start_date + duration = end_date
duration = sum(duration of subtasks)
This directly generates the following requirements for updates/deletes: whenever a subtask's start date, end date or duration changes, the parent task's dates and duration (and the dates of the subtasks that follow it) have to be recomputed, and whenever a task's start date changes, it has to be propagated to its first subtask.
Current approach
The current approach (shown below) combines triggers on both tables, a package with a global "update in progress" flag to suppress recursive trigger firing, and a scheduler job that periodically recomputes flagged tasks. This (kind of) works, but it has several drawbacks: the recomputation is asynchronous (it only happens when the job next runs), and the global package variable is a fragile way to keep the triggers from firing each other.
So my question is - is there any sensible alternative approach for this?
Package
create or replace package pkg_task is
g_update_in_progress boolean;
procedure recomputeDates(p_TaskID in task.pk%TYPE);
procedure recomputeAllDates;
end;
create or replace package body pkg_task is
procedure recomputeDates(p_TaskID in task.pk%TYPE) is
begin
g_update_in_progress := true;
-- update the subtasks
merge into subtask tgt
using (select pk,
start_date,
duration_in_days,
end_date,
sum(duration_in_days) over(partition by task_fk order by ordering) as cumulative_duration,
min(start_date) over(partition by task_fk) + sum(duration_in_days) over(partition by task_fk order by ordering rows between unbounded preceding and 1 preceding) as new_start_date,
min(start_date) over(partition by task_fk) + sum(duration_in_days) over(partition by task_fk order by ordering) as new_end_date
from subtask s
where s.task_fk = p_TaskID
order by task_fk,
ordering) src
on (src.pk = tgt.pk)
when matched then
update
set tgt.start_date = nvl(src.new_start_date,
src.start_date),
tgt.end_date = nvl(src.new_end_date,
src.end_date);
-- update the task
merge into task tgt
using (select p_TaskID as pk,
min(s.start_date) as new_start_date,
max(s.end_date) as new_end_date,
sum(s.duration_in_days) as new_duration
from subtask s
where s.task_fk = p_TaskID) src
on (tgt.pk = src.pk)
when matched then
update
set tgt.start_date = src.new_start_date,
tgt.end_date = src.new_end_date,
tgt.duration_in_days = src.new_duration,
tgt.needs_recomputation = 0;
g_update_in_progress := false;
end;
procedure recomputeAllDates is
begin
for cur in (select pk
from task t
where t.needs_recomputation = 1)
loop
recomputeDates(cur.pk);
end loop;
end;
begin
g_update_in_progress := false;
end;
Triggers
create or replace trigger trg_task
before update on task
for each row
begin
if (:new.start_date <> :old.start_date and not pkg_task.g_update_in_progress) then
pkg_task.g_update_in_progress := true;
-- set the start date for the first subtask
update subtask s
set s.start_date = :new.start_date
where s.task_fk = :new.pk
and s.ordering = 1;
:new.needs_recomputation := 1;
pkg_task.g_update_in_progress := false;
end if;
end;
create or replace trigger trg_subtask
before update on subtask
for each row
declare
l_date_changed boolean := false;
begin
if (not pkg_task.g_update_in_progress) then
pkg_task.g_update_in_progress := true;
if (:new.start_date <> :old.start_date) then
:new.end_date := :new.start_date + :new.duration_in_days;
l_date_changed := true;
end if;
if (:new.end_date <> :old.end_date) then
:new.duration_in_days := :new.end_date - :new.start_date;
l_date_changed := true;
end if;
if (:new.duration_in_days <> :old.duration_in_days) then
:new.end_date := :new.start_date + :new.duration_in_days;
l_date_changed := true;
end if;
if l_date_changed then
-- set the needs_recomputation flag for the parent task
-- if this is the first subtask, set the parent's start date, as well
update task t
set t.start_date =
(case
when :new.ordering = 1 then
:new.start_date
else
t.start_date
end),
t.needs_recomputation = 1
where t.pk = :new.task_fk;
end if;
pkg_task.g_update_in_progress := false;
end if;
end;
Job
begin
dbms_scheduler.create_job(
job_name => 'JOB_SYNC_TASKS'
,job_type => 'PLSQL_BLOCK'
,job_action => 'begin pkg_task.recomputeAllDates; commit; end; '
,start_date => to_timestamp_tz('2014-01-14 10:00:00 Europe/Berlin',
'yyyy-mm-dd hh24:mi:ss tzr')
,repeat_interval => 'FREQ=HOURLY;BYMINUTE=0,5,10,15,20,25,30,35,40,45,50,55'
,enabled => TRUE
,comments => 'Task sync job, runs every 5 minutes');
end;
Using triggers here is just asking for trouble.
Furthermore, using the scheduler is probably not the best idea, since scheduled jobs can only see committed data. So either you commit in the trigger, which throws transaction logic out the window, or the recomputation of the tables is delayed until (some time after) the end of the transaction.
You should do one of the following:
1. Use procedures. The simplest answer. When you have multiple applications, they should not perform DML/business logic directly; they should always do it through procedures so that they all run the same code. Forbid direct DML with grants or views. You may need to force the use of procedures through INSTEAD OF triggers on views (consider this only if you can't modify the application).
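For example, a minimal sketch of what such an API could look like - the procedure name set_subtask_duration and the grantee app_user are made up for illustration; the point is only that the change and the recomputation happen in the same call and the same transaction:
create or replace procedure set_subtask_duration(
  p_subtask_id in subtask.pk%type,
  p_duration   in subtask.duration_in_days%type
) is
  l_task_id subtask.task_fk%type;
begin
  -- apply the change
  update subtask s
     set s.duration_in_days = p_duration,
         s.end_date         = s.start_date + p_duration
   where s.pk = p_subtask_id
  returning task_fk into l_task_id;
  -- keep the parent task and the sibling subtasks consistent right away,
  -- instead of waiting for the scheduler job
  pkg_task.recomputeDates(l_task_id);
end;
-- applications only get EXECUTE on the API, no direct DML on the tables
grant execute on set_subtask_duration to app_user;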
2. Probably even better than procedures in your case: use a schema that doesn't contain duplicate data. You don't want to store redundant data: it makes application development more complex than needed. In terms of performance, resources and energy, the best way to solve a problem is to realize that the task is unnecessary.
From the description of your model, here are the columns that you could remove: task.duration_in_days, task.end_date, task.needs_recomputation, subtask.start_date and subtask.end_date. The task table would contain the start date only, and each subtask would only store its duration. When you need the aggregate information, use joins. You can use views to let the applications access the data transparently.
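A minimal sketch of what those views could look like, assuming the slimmed-down tables keep only task(pk, name, start_date) and subtask(pk, task_fk, name, duration_in_days, ordering) - the view names subtask_v and task_v are illustrative:
create or replace view subtask_v as
select s.pk,
       s.task_fk,
       s.name,
       s.ordering,
       s.duration_in_days,
       -- a subtask starts where the preceding subtasks of the same task end
       t.start_date
         + nvl(sum(s.duration_in_days)
                 over (partition by s.task_fk
                       order by s.ordering
                       rows between unbounded preceding and 1 preceding), 0) as start_date,
       t.start_date
         + sum(s.duration_in_days)
             over (partition by s.task_fk order by s.ordering) as end_date
  from subtask s
  join task t
    on t.pk = s.task_fk;
create or replace view task_v as
select t.pk,
       t.name,
       t.start_date,
       sum(s.duration_in_days)                as duration_in_days,
       t.start_date + sum(s.duration_in_days) as end_date
  from task t
  left join subtask s
    on s.task_fk = t.pk
 group by t.pk, t.name, t.start_date;
With this in place both business rules hold by construction: there is nothing to keep in sync, so the triggers, the needs_recomputation flag and the job can go away.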
3. Use a mutating-trigger workaround that uses package variables to identify modified rows, together with BEFORE and AFTER statement triggers. Obviously this will involve lots of code that is hard to write, test and maintain, so you should use options (1) and (2) whenever possible instead.
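For completeness, a skeleton of that pattern might look roughly like this - it reuses the existing pkg_task.g_update_in_progress flag to stop recursive firing, the package and trigger names are made up, the collection index assumes the task pk fits into a PLS_INTEGER, and inserts/deletes would need the same treatment:
create or replace package pkg_subtask_sync is
  type t_task_ids is table of task.pk%type index by pls_integer;
  g_task_ids t_task_ids;  -- parent tasks touched by the current statement
end;
create or replace trigger trg_subtask_bs
before update on subtask
begin
  if not pkg_task.g_update_in_progress then
    pkg_subtask_sync.g_task_ids.delete;  -- start every statement with an empty list
  end if;
end;
create or replace trigger trg_subtask_ar
after update of start_date, end_date, duration_in_days on subtask
for each row
begin
  if not pkg_task.g_update_in_progress then
    -- remember the parent task; using the id as the index deduplicates
    pkg_subtask_sync.g_task_ids(:new.task_fk) := :new.task_fk;
  end if;
end;
create or replace trigger trg_subtask_as
after update on subtask
declare
  l_ids pkg_subtask_sync.t_task_ids := pkg_subtask_sync.g_task_ids;
  l_idx pls_integer := l_ids.first;
begin
  if not pkg_task.g_update_in_progress then
    pkg_subtask_sync.g_task_ids.delete;
    while l_idx is not null loop
      pkg_task.recomputeDates(l_ids(l_idx));  -- sets the flag itself while it runs
      l_idx := l_ids.next(l_idx);
    end loop;
  end if;
end;
Compare the number of moving parts here with the view-based approach above - that is why options (1) and (2) are preferable.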