I can think of two places to put the domain logic in an event-sourced system, and each has a drawback.
Example: Calculating a metric from some data.
Either I have to calculate the metric twice (once in the domain model and once in the projection), or I have to calculate it before emitting the event and include the result in the event.
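To make the second option concrete, here is a rough sketch of what I mean (all names such as MetricCalculated, sensorId and projectMetricCalculated are made up for illustration): the metric is computed once in the domain model and carried in the event, so the projection only copies it into the read model instead of recalculating it.

```typescript
// Hypothetical event that already carries the computed metric.
interface MetricCalculated {
  type: "MetricCalculated";
  sensorId: string;
  metricValue: number; // computed by the aggregate, not by the projection
}

// Read-model projection: no domain logic, just storage.
const readModel = new Map<string, number>();

function projectMetricCalculated(event: MetricCalculated): void {
  readModel.set(event.sensorId, event.metricValue);
}
```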
The flow of control is usually like this:
1. An aggregate command method must ensure that its parameters and the current aggregate state allow the operation to be performed.
2. The command method then creates an event and calls a When (or Apply) method to handle it.
3. The event handler only mutates the aggregate state; there is no logic there.
4. Anything further happens in projections.
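Here is a minimal sketch of that flow, reusing the metric example from the question. The Sensor aggregate, its fields and the MetricCalculated event are assumptions for illustration, not taken from any specific framework:

```typescript
interface MetricCalculated {
  type: "MetricCalculated";
  sensorId: string;
  metricValue: number;
}

class Sensor {
  private metric = 0;
  private readonly uncommitted: MetricCalculated[] = [];

  constructor(private readonly id: string) {}

  // 1. Command method: checks that parameters and current state allow the operation.
  recordReadings(readings: number[]): void {
    if (readings.length === 0) {
      throw new Error("at least one reading is required"); // invariant protection lives here
    }
    const metricValue = readings.reduce((sum, r) => sum + r, 0) / readings.length;
    // 2. Create the event and hand it to the When/Apply handler.
    this.applyChange({ type: "MetricCalculated", sensorId: this.id, metricValue });
  }

  // 3. Event handler: mutates state only; no validation, no branching.
  private when(event: MetricCalculated): void {
    this.metric = event.metricValue;
  }

  private applyChange(event: MetricCalculated): void {
    this.when(event);
    this.uncommitted.push(event); // picked up by the repository for persistence
  }

  // Replay path used when loading from the event stream: same handler, no checks.
  loadFromHistory(history: MetricCalculated[]): void {
    for (const event of history) {
      this.when(event);
    }
  }
}
```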
The reason to put invariant protection (a.k.a. business logic) into the aggregate's command methods, before applying events, is that once an event has been generated there is no turning back: that thing has already happened. You cannot refuse to apply an event. Think about replaying events when recovering the aggregate from its event stream (reading from the repository): how could that possibly work if one day you decided to put an if-throw combination in the event handler?
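To see why, here is how recovery typically looks, continuing the Sensor sketch above. The EventStore interface is an assumption for illustration, not the API of any particular library:

```typescript
interface EventStore {
  readStream(streamId: string): MetricCalculated[];
}

function loadSensor(store: EventStore, sensorId: string): Sensor {
  const sensor = new Sensor(sensorId);
  // These events are facts that already happened; an if-throw inside the
  // handler could make a previously valid stream impossible to replay.
  sensor.loadFromHistory(store.readStream(sensorId));
  return sensor;
}
```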
So, in short:
No one ever said that event sourcing would help you fix issues in your calculations. As an extra safety net you might want to save the commands as well, but then you would have to issue compensating events or truncate streams, which is not really something you want to do.
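For illustration only, a compensating event for a wrongly calculated metric might look like this (MetricCorrected and its fields are made up): the stream is not edited; a new event is appended and read models pick up the corrected value on the next rebuild or replay.

```typescript
interface MetricCorrected {
  type: "MetricCorrected";
  sensorId: string;
  correctedValue: number;
  reason: string;
}

const correction: MetricCorrected = {
  type: "MetricCorrected",
  sensorId: "sensor-42",
  correctedValue: 18.0,
  reason: "average was computed over the wrong window before the fix",
};
// Appending `correction` leaves the original (wrong) event intact as history.
```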