
Migrating Activiti tasks from old process to new process

I have an Activiti project for some business process.

The problem is about migration. An existing process has some unfinished tasks. I want to modify the existing process by adding a new step.

Now, when I start a new process instance, it will be processed according to the updated process definition, while unfinished tasks continue to follow the old one.

Let's take the following example: https://spring.io/blog/2015/03/08/getting-started-with-activiti-and-spring-boot

In this example, consider the following line:

taskVariables.put("telephoneInterviewOutcome", true);

Assume that, I have some business logic code where I check the value of this variable such as:

if (Boolean.TRUE.equals(taskVariables.get("telephoneInterviewOutcome"))) {...}

Now assume I want to change this variable from a Boolean to an enum. I need to update my business logic as well:

if (taskVariables.get("telephoneInterviewOutcome") == SOMEENUM) {...}

Now my business logic has to branch on the process version of the task at hand: if the task belongs to version 1 of the process, I use the first check, otherwise the second, like this:

if (getProcessVersion(task) == 1) {
    if (Boolean.TRUE.equals(taskVariables.get("telephoneInterviewOutcome"))) {...}
} else {
    if (taskVariables.get("telephoneInterviewOutcome") == SOMEENUM) {...}
}

The problem with this approach is that the business logic grows with every process update, which invites bugs in production.

Is there any other solution to this problem? How can I solve this problem without changing business logic code?

asked Dec 10 '15 by Arif Acar


2 Answers

What you describe is one of the key pain points of any long-running process implementation. Many processes I have implemented live in excess of 12 months, so you always have to consider how the process model will evolve.

Philippe raised some good techniques to reduce the risk, but even separating business logic from integrations and externalizing decision points to a rules engine doesn't always get you where you need to be.

Your example of adding a task and changing the type of a variable are classic cases where what we like to call "in-flight" processes will simply fail in the new process model if you don't branch.

Other classic examples are failing to initialize a variable that the new process needs, and adding decision logic that can never succeed for in-flight processes.

In general there are a few ways to handle process evolution:

  1. Let old processes finish their original flow and start new processes on the new flow. Obviously this only works for some cases, but it is the simplest approach. Very long-running processes tend not to fall into this bucket.

  2. Externalize all data to an external system of record and minimize the data held within the process itself (keep references, not the data). This is an extension of the points Philippe makes. Following this best practice means your process logic is not tightly bound to the data; it is bound only to the decisions that influence the flow of the process. Externalize the rules, decisions, and services called to make those decisions, and you are much freer to modify process logic with less impact on in-flight processes.

  3. Add business logic into the process to specifically handle in-flights. This is the approach you cite in your question. It is viable, but as you say, can become a maintenance nightmare.

  4. Define "safe zones" or milestones in your process by using called sub-processes with a well-defined interface. This way you can swap out segments of a process in a controlled manner. Obviously, what happens to instances that have tokens inside a segment being replaced? You need either to bleed those instances out over time or to plan ahead and stop processes from entering a sub-process when a new module for that segment is about to be deployed.

And then there are combinations of all of the above (which is what tends to happen in real life).
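To keep option 3 from becoming the maintenance nightmare described above, the version branch can at least be isolated behind a version-keyed registry instead of inline if/else chains scattered through the code. Below is a minimal, engine-independent sketch in plain Java; all names (`InterviewOutcome`, `OutcomeEvaluator`, the version numbers) are hypothetical, and in a real Activiti application the version would come from the task's process definition:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Predicate;

// Hypothetical enum standing in for the SOMEENUM of the question.
enum InterviewOutcome { PASSED, FAILED, RESCHEDULED }

public class OutcomeEvaluator {

    // One predicate per process definition version; supporting a new
    // version means registering one new entry, not editing branches.
    private static final Map<Integer, Predicate<Map<String, Object>>> CHECKS = new HashMap<>();

    static {
        // Version 1 stored the outcome as a Boolean.
        CHECKS.put(1, vars -> Boolean.TRUE.equals(vars.get("telephoneInterviewOutcome")));
        // Version 2 stores it as an enum.
        CHECKS.put(2, vars -> vars.get("telephoneInterviewOutcome") == InterviewOutcome.PASSED);
    }

    public static boolean interviewPassed(int processVersion, Map<String, Object> taskVariables) {
        Predicate<Map<String, Object>> check = CHECKS.get(processVersion);
        if (check == null) {
            throw new IllegalArgumentException("No handler for process version " + processVersion);
        }
        return check.test(taskVariables);
    }
}
```

The business code then makes a single call per decision, and the version-specific knowledge stays in one place that can be pruned as old in-flight instances drain out.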

There is no easy solution. Some BPMS products have tooling that helps identify potential upgrade issues, but it still comes down to good architecture, planning, and testing.

answered Nov 01 '22 by Greg Harley


I have faced the same kind of issue and, unfortunately, I don't think there is a simple answer to your problem. Nevertheless, I've managed to keep some sanity in my codebase by following a few principles:

  1. Recognize that process definitions and business logic have different development lifecycles. Sometimes they change together, but usually not, especially as a process evolves to address new situations (a new role, a new approval step, etc.).

  2. Keep "process logic" to a minimum. Implement it using scripting languages (my choice is Groovy, but you can opt for any other JVM-based scripting language). This way you can deploy it with your process and don't have to worry about different versions.

  3. When a process needs to call a business service, do it through a technology such as REST or SOAP. An ESB can be a handy companion for your BPM server, creating a clear separation between your processes and your business services.
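Principles 2 and 3 can be sketched without any engine dependency: the process-side delegate holds only a reference and routes on the service's answer, while the actual business decision lives behind the service boundary. `InterviewService` and `InterviewGateway` are hypothetical names for illustration, and the interface stands in for what would be a REST or SOAP client in a real deployment:

```java
import java.util.Map;

// Hypothetical service boundary: the process never holds interview
// data, only a candidate reference; the decision lives behind this
// interface (in practice, a REST/SOAP call, possibly via an ESB).
interface InterviewService {
    String outcomeFor(String candidateRef);
}

public class InterviewGateway {

    private final InterviewService service;

    public InterviewGateway(InterviewService service) {
        this.service = service;
    }

    // Process logic reduced to a minimum: pass the reference,
    // route on the answer. Changing how the outcome is computed
    // or stored no longer touches the process model.
    public boolean shouldProceed(Map<String, Object> processVars) {
        String ref = (String) processVars.get("candidateRef");
        return "PASSED".equals(service.outcomeFor(ref));
    }
}
```

Because the service is an interface, in-flight instances keep working as long as its contract is stable, and the implementation behind it can evolve freely.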

I hope these principles help you think through your problem.

answered Nov 01 '22 by Philippe Sevestre