Kubernetes deployment strategy using CQRS with dotnet & MongoDb

I am redesigning a .NET backend API using the CQRS approach. This question is about how to handle the query side in the context of a Kubernetes deployment.

I am thinking of using MongoDB as the query database. The app is a .NET Web API app. So what would be the best approach:

  1. Run MongoDB as a sidecar container next to the dotnet app, so both live in one pod. Scale as needed.

  2. Containerize MongoDB in its own pod and deploy one MongoDB pod per region. Then have the dotnet containers use the MongoDB pod within their own region. Scale MongoDB by region, and the dotnet pods as needed within and between regions.

  3. Some other approach I haven't thought of
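For reference, option 2 could be sketched roughly as the following Kubernetes manifest. All names, labels, and the choice of a single-replica StatefulSet here are illustrative assumptions, not anything prescribed by the question:

```yaml
# Hypothetical sketch of option 2: one MongoDB instance per region.
# Names, labels, and image tag are illustrative assumptions.
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: mongodb-eu-west          # one StatefulSet per region
spec:
  serviceName: mongodb-eu-west
  replicas: 1
  selector:
    matchLabels:
      app: mongodb
      region: eu-west
  template:
    metadata:
      labels:
        app: mongodb
        region: eu-west
    spec:
      nodeSelector:
        topology.kubernetes.io/region: eu-west   # pin to the region's nodes
      containers:
        - name: mongodb
          image: mongo:6
          ports:
            - containerPort: 27017
---
apiVersion: v1
kind: Service
metadata:
  name: mongodb-eu-west          # dotnet pods in eu-west point at this Service
spec:
  selector:
    app: mongodb
    region: eu-west
  ports:
    - port: 27017
```

The dotnet pods in each region would then be configured with the connection string of their own regional Service.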

brando, asked Nov 06 '22 01:11
1 Answer

I would start with the simplest approach, which is to place the write and read sides together, because they belong to the same bounded context.

Then, if it is needed in the future, I would consider adding more read sides or scaling out to other regions.

To get started, I would also consider running the read side inside the same VM as the write side, just to keep it simple; getting it all up and working in production is always a big task with a lot of pitfalls.

I would consider using a Kafka-like system to transport the data to the read sides. With queues, adding a new read side or rebuilding an existing read-side instance can be troublesome, because the sender needs to know which read sides exist. With a Kafka style of integration, each read side can consume the events at its own pace, you can more easily add read sides later on, and the sender does not need to be aware of the receivers.
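The decoupling can be sketched with a toy append-only log in Python. This is an illustration of the idea only, not a real Kafka client; the classes and event names are made up:

```python
# Toy append-only log: the producer appends events and knows nothing
# about its consumers; each consumer tracks its own read offset.
class EventLog:
    def __init__(self):
        self.events = []            # the append-only log

    def append(self, event):
        self.events.append(event)

    def read_from(self, offset):
        return self.events[offset:]


class ReadSide:
    """A read side that keeps its own offset and consumes at its own pace."""
    def __init__(self, log):
        self.log = log
        self.offset = 0
        self.seen = []

    def poll(self):
        for event in self.log.read_from(self.offset):
            self.seen.append(event["type"])
            self.offset += 1


log = EventLog()

# The write side appends events without knowing who will read them.
log.append({"type": "OrderPlaced", "order_id": 1})
log.append({"type": "OrderShipped", "order_id": 1})

fast = ReadSide(log)
fast.poll()                         # consumes both events immediately

slow = ReadSide(log)                # added later: replays from offset 0
slow.poll()

print(fast.seen)  # ['OrderPlaced', 'OrderShipped']
print(slow.seen)  # ['OrderPlaced', 'OrderShipped']
```

Note that the write side never references `fast` or `slow`; a new read side added later simply starts reading from offset 0.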

Kafka allows you to decouple the producers of data from the consumers of the data (the original answer included a diagram from one of the author's training classes).

In Kafka you have a set of producers appending data to the Kafka log, and then one or more consumers processing this log of events.
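Rebuilding a read side then amounts to replaying the log from the beginning into a fresh projection. A minimal sketch, with event shapes and the projection invented for illustration:

```python
# Rebuild a read-side projection by replaying the full event log.
# The event shapes and the projection logic are invented for illustration.
events = [
    {"type": "OrderPlaced",  "order_id": 1},
    {"type": "OrderPlaced",  "order_id": 2},
    {"type": "OrderShipped", "order_id": 1},
]

def project(events):
    """Fold the event log into the read model, here a dict of order statuses."""
    statuses = {}
    for event in events:
        if event["type"] == "OrderPlaced":
            statuses[event["order_id"]] = "placed"
        elif event["type"] == "OrderShipped":
            statuses[event["order_id"]] = "shipped"
    return statuses

print(project(events))  # {1: 'shipped', 2: 'placed'}
```

In the question's setup, the projection target would be the MongoDB query database rather than an in-memory dict, but the rebuild mechanism is the same: drop the read model and replay.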

Tore Nestenius, answered Nov 15 '22 07:11