 

Python: how to mock a Kafka topic for unit tests?

We have a message scheduler that generates a hash-key from the message attributes before placing it on a Kafka topic queue with the key.

This is done for de-duplication purposes. However, I am not sure how I could possibly test this deduplication without actually setting up a local cluster and checking that it is performing as expected.
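For context, here is a minimal sketch of the pattern described above. The make_message_key() function, the topic name, and the use of the third-party kafka-python client are all assumptions for illustration, not the actual scheduler code:

    # Minimal sketch of the described pattern; make_message_key(), the topic
    # name, and the kafka-python client are assumptions for illustration.
    import hashlib
    import json

    from kafka import KafkaProducer


    def make_message_key(message: dict) -> str:
        # Hypothetical: derive a stable digest from the message attributes,
        # so identical messages always map to the same Kafka key.
        canonical = json.dumps(message, sort_keys=True)
        return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    message = {"user_id": 42, "action": "signup"}
    producer.send(
        "scheduled-messages",  # hypothetical topic name
        key=make_message_key(message).encode("utf-8"),
        value=json.dumps(message).encode("utf-8"),
    )
    producer.flush()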

Searching online for tools for mocking a Kafka topic queue has not helped, and I am concerned that I am perhaps thinking about this the wrong way.

Ultimately, whatever is used to mock the Kafka queue should behave the same way as a local cluster - i.e. provide de-duplication with key inserts to a topic queue.

Are there any such tools?

asked Oct 31 '16 by user1658296



1 Answer

If you need to verify a Kafka-specific feature, or an implementation that depends on a Kafka-specific feature, then the only way to do it is by using Kafka!

Does Kafka have any tests around its deduplication logic? If so, the combination of the following may be enough to mitigate your organization's perceived risks of failure:

  • unit tests of your hash logic (make sure that the same object does indeed generate the same hash), as sketched after this list
  • Kafka topic deduplication tests (internal to Kafka project)
  • pre-flight smoke tests verifying your app's integration with Kafka
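For the first bullet, a unit test might look like the following sketch. It re-defines the same hypothetical make_message_key() helper used in the question's sketch; substitute your scheduler's real hash function:

    import hashlib
    import json
    import unittest


    def make_message_key(message: dict) -> str:
        # Same hypothetical helper as sketched in the question.
        canonical = json.dumps(message, sort_keys=True)
        return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


    class HashKeyTest(unittest.TestCase):
        def test_same_attributes_produce_same_key(self):
            # Equal attributes must hash to the same key, otherwise
            # key-based deduplication can never work.
            msg = {"user_id": 42, "action": "signup"}
            self.assertEqual(make_message_key(msg), make_message_key(dict(msg)))

        def test_different_attributes_produce_different_keys(self):
            a = {"user_id": 42, "action": "signup"}
            b = {"user_id": 43, "action": "signup"}
            self.assertNotEqual(make_message_key(a), make_message_key(b))


    if __name__ == "__main__":
        unittest.main()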

If Kafka does NOT have any sort of tests around its topic deduplication, or you are concerned about breaking changes, then it is important to have automated checks around Kafka-specific functionality. This can be done through integration tests. I have had much success recently with Docker-based integration test pipelines. After the initial legwork of creating a Kafka Docker image (one is probably already available from the community), it becomes trivial to set up integration test pipelines. A pipeline could look like this:

  • application-based unit tests are executed (hash logic)
  • once those pass, your CI server starts up Kafka (see the fixture sketch after this list)
  • integration tests are executed, verifying that duplicate writes only emit a single message to a topic (also sketched below).
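To make the last two steps concrete, here are two sketches. Both assume the third-party testcontainers and kafka-python packages and reuse the hypothetical make_message_key() helper from above; adapt the names to your setup. First, a pytest fixture that starts a disposable Kafka broker in Docker:

    import pytest
    from testcontainers.kafka import KafkaContainer


    @pytest.fixture(scope="session")
    def kafka_bootstrap_server():
        # Start a throwaway Kafka broker in Docker for the test session;
        # the container is torn down automatically on exit.
        with KafkaContainer() as kafka:
            yield kafka.get_bootstrap_server()

Second, an integration test that performs a duplicate write. One caveat: if your topic relies on log compaction for key-based deduplication, the duplicate record is removed asynchronously, so the deterministic thing to assert immediately after producing is that both writes carry the identical key; depending on your topic configuration you may instead be able to assert that only one record survives:

    import json
    import uuid

    from kafka import KafkaConsumer, KafkaProducer


    def test_duplicate_write_uses_a_single_key(kafka_bootstrap_server):
        topic = f"dedup-test-{uuid.uuid4()}"  # fresh topic per run
        message = {"user_id": 42, "action": "signup"}
        key = make_message_key(message).encode("utf-8")  # hypothetical helper from above

        producer = KafkaProducer(bootstrap_servers=kafka_bootstrap_server)
        for _ in range(2):  # the duplicate write
            producer.send(topic, key=key, value=json.dumps(message).encode("utf-8"))
        producer.flush()

        consumer = KafkaConsumer(
            topic,
            bootstrap_servers=kafka_bootstrap_server,
            auto_offset_reset="earliest",
            consumer_timeout_ms=5000,
        )
        records = list(consumer)
        assert len(records) == 2  # both writes landed...
        assert len({r.key for r in records}) == 1  # ...under a single key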

I think the important thing is to minimize the Kafka integration tests so that they ONLY include tests that absolutely rely on Kafka-specific functionality. Even using docker-compose, they may be orders of magnitude slower than unit tests (~1 ms for a unit test vs ~1 second for an integration test). Another thing to consider is whether the overhead of maintaining an integration pipeline is worth it, or whether it is acceptable to simply trust that Kafka will provide the topic deduplication it claims to.

answered Sep 29 '22 by dm03514