
How can I ensure consistency when using an event-carried state transfer approach in Kafka

Let's suppose a simplified scenario like this:

  1. There are two Kafka topics, users and orders, and three microservices: user-service, order-service and shipping-service.

  2. When an order is placed through the order service, an OrderCreated event is written to the orders topic and consumed by the shipping service. This service needs the user's information to ship the order. Per my requirements I can't make a REST call to user-service; I have to use a stateful approach instead. That is to say, the shipping service is a Kafka Streams application that consumes the users topic into a KTable backed by a local store holding the full user table. Thus, when processing an order it already has the user information available locally.
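The enrichment pattern described above can be sketched in plain Python — this is a simulation of the KTable-backed lookup, not actual Kafka Streams code, and the topic contents and field names are illustrative:

```python
# Simulation of the shipping service's stateful enrichment:
# a local "KTable" (a dict) materialized from the users topic,
# used to enrich OrderCreated events without a REST call.

users_topic = [
    {"user_id": "u1", "name": "Alice", "address": "1 Old Street"},
]
orders_topic = [
    {"order_id": "o1", "user_id": "u1", "item": "book"},
]

user_table = {}  # local store backing the KTable

def on_user_event(event):
    # Upsert: the latest record per key wins (compacted-topic semantics)
    user_table[event["user_id"]] = event

def on_order_event(event):
    # Enrich the order with whatever user state is currently local
    user = user_table.get(event["user_id"])
    return {**event, "ship_to": user["address"] if user else None}

for e in users_topic:
    on_user_event(e)

shipments = [on_order_event(e) for e in orders_topic]
print(shipments[0]["ship_to"])  # 1 Old Street
```

The key point: the order handler only ever reads the local store, so its correctness depends entirely on how up to date that store is — which is exactly the concern raised below.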

However, one concern with this approach is the consistency of the local user information in the shipping service, e.g.:

  1. A user updates their shipping address in the user-service, which updates its local SQL database and publishes an event with this change to the users topic.

  2. The user places an order, so order-service publishes an OrderCreated event to the orders topic.

  3. For whatever reason, the shipping service could process the OrderCreated event from the orders topic before reading the UserUpdated event from the users topic, so it would use an address that is no longer valid.
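The race in step 3 can be reproduced in a plain-Python simulation (again with illustrative names): Kafka gives no ordering guarantee across topics, so nothing prevents the order from being processed before the update.

```python
# Two independent topics: Kafka provides no cross-topic ordering,
# so the shipping service may interleave them either way.
user_table = {"u1": {"user_id": "u1", "address": "1 Old Street"}}

user_updated = {"user_id": "u1", "address": "2 New Avenue"}  # step 1
order_created = {"order_id": "o1", "user_id": "u1"}          # step 2

def handle_order(event):
    # Reads whatever is in the local store at processing time
    return user_table[event["user_id"]]["address"]

# Unlucky interleaving (step 3): the order is processed first,
# then the address update arrives -- too late.
shipped_to = handle_order(order_created)
user_table[user_updated["user_id"]] = user_updated

print(shipped_to)  # 1 Old Street -- no longer the user's address
```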

How could I guarantee that the shipping service always has up-to-date user information in this event-carried state transfer scenario?

codependent asked Mar 29 '19


1 Answer

If you need ordering guarantees, you would need to write both the user information update as well as the order into the same topic (and in particular into the same partition) because Kafka only guarantees order within a single partition.

You could call this topic "user_action", with a unique user-id as key (both a user information update and a user order are user actions). In your case, all three services would consume the "user_action" topic. While the user service only considers user updates and the order service only considers orders, the shipping service considers both.
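A minimal sketch of that idea, with plain Python standing in for the producer's default key-hash partitioner (event shapes are hypothetical):

```python
# All actions for one user share a key, so they hash to the same
# partition and are consumed in the order they were produced.
NUM_PARTITIONS = 3
partitions = [[] for _ in range(NUM_PARTITIONS)]

def produce(key, event):
    # Keyed partitioning: same key -> same partition -> total order
    partitions[hash(key) % NUM_PARTITIONS].append(event)

# Both the address update and the order are "user actions" keyed by user id
produce("u1", {"type": "UserUpdated", "user_id": "u1", "address": "2 New Avenue"})
produce("u1", {"type": "OrderCreated", "user_id": "u1", "order_id": "o1"})

# The shipping service consumes u1's single partition in order
user_table = {}
shipments = []
for event in partitions[hash("u1") % NUM_PARTITIONS]:
    if event["type"] == "UserUpdated":
        user_table[event["user_id"]] = event["address"]
    elif event["type"] == "OrderCreated":
        shipments.append({"order_id": event["order_id"],
                          "ship_to": user_table.get(event["user_id"])})

print(shipments[0]["ship_to"])  # 2 New Avenue -- update seen before the order
```

Because both events share the key "u1", the consumer is guaranteed to see the UserUpdated record before the OrderCreated record, eliminating the race from the question.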

This blog post might help, too: https://www.confluent.io/blog/put-several-event-types-kafka-topic/

Matthias J. Sax answered Oct 16 '22