We have a vanilla Apache Kafka setup in our current infrastructure, and we have started logging some data that we want to process with Kafka Connect. We currently use Avro as our message format, but there is no Schema Registry in our infrastructure. In the future we plan to replace the current stack with Confluent and use Schema Registry together with Connect, but for some time we need to deploy only Connect against the existing cluster.
Is it possible to configure Connect sinks so that they use explicit .avsc files (or an inline schema) without connecting to a Schema Registry and without the Confluent wire format with its magic byte and schema ID?
An Avro schema is used, for example, to serialize a Java object (POJO) into bytes and to deserialize those bytes back into the Java object. Avro requires a schema not only during serialization, but also during deserialization.
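A minimal sketch of that round trip using Avro's GenericRecord API; the User schema here is a made-up example, and in practice you would parse your own .avsc file instead:

import java.io.ByteArrayOutputStream;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class AvroRoundTrip {
    public static void main(String[] args) throws Exception {
        // Hypothetical schema; load yours with new Schema.Parser().parse(new File(...))
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"User\",\"fields\":"
            + "[{\"name\":\"name\",\"type\":\"string\"}]}");

        GenericRecord user = new GenericData.Record(schema);
        user.put("name", "alice");

        // Serialization: the writer needs the schema
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(user, encoder);
        encoder.flush();
        byte[] bytes = out.toByteArray();

        // Deserialization: the reader needs the schema again
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(bytes, null);
        GenericRecord decoded =
            new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
        System.out.println(decoded.get("name")); // alice
    }
}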
As the Confluent documentation puts it: although Schema Registry is not a required service for Kafka Connect, it enables you to easily use Avro, Protobuf, and JSON Schema as common data formats for the Kafka records that connectors read from and write to.
No, the Confluent Schema Registry is not required to produce or consume Avro-encoded data in the key or value of a Kafka record. Avro's object container file format is self-describing: the payload is always accompanied by its writer schema.
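To illustrate that self-describing container format, here is a sketch: the writer schema is stored in the file header, so a reader can recover it without any registry. The file name and schema are placeholders:

import java.io.File;
import org.apache.avro.Schema;
import org.apache.avro.file.DataFileReader;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;

public class AvroContainerFileDemo {
    public static void main(String[] args) throws Exception {
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"User\",\"fields\":"
            + "[{\"name\":\"name\",\"type\":\"string\"}]}");

        GenericRecord user = new GenericData.Record(schema);
        user.put("name", "alice");

        // The container file stores the writer schema in its header...
        File file = new File("users.avro");
        try (DataFileWriter<GenericRecord> writer =
                new DataFileWriter<>(new GenericDatumWriter<GenericRecord>(schema))) {
            writer.create(schema, file);
            writer.append(user);
        }

        // ...so a reader recovers the schema from the file itself, no registry needed
        try (DataFileReader<GenericRecord> reader =
                new DataFileReader<>(file, new GenericDatumReader<GenericRecord>())) {
            System.out.println(reader.getSchema());
            while (reader.hasNext()) {
                System.out.println(reader.next());
            }
        }
    }
}

Note, though, that individual Kafka record values are usually written with Avro's bare binary encoding, which carries no schema at all; that gap is exactly what the converter below fills by taking the schema from a local file.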
Yes, it is possible using the registryless-avro-converter on GitHub.
Follow the build instructions there, add the resulting JAR to a directory on your plugin.path
(the same way other connector plugins are loaded), then configure it like so:
key.converter=me.frmr.kafka.connect.RegistrylessAvroConverter
key.converter.schema.path=/path/to/schema/file.avsc
value.converter=me.frmr.kafka.connect.RegistrylessAvroConverter
value.converter.schema.path=/path/to/schema/file.avsc
Note, however, that this approach requires you to store, maintain, and synchronize the schema files across all Connect workers.
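To see where those settings live, here is an illustrative standalone-mode sketch; the FileStreamSink connector, topic name, and paths are placeholders rather than anything from the original setup:

# worker.properties (excerpt)
bootstrap.servers=localhost:9092
plugin.path=/opt/connect/plugins
offset.storage.file.filename=/tmp/connect.offsets
key.converter=me.frmr.kafka.connect.RegistrylessAvroConverter
key.converter.schema.path=/path/to/schema/file.avsc
value.converter=me.frmr.kafka.connect.RegistrylessAvroConverter
value.converter.schema.path=/path/to/schema/file.avsc

# sink.properties (illustrative FileStreamSink)
name=avro-file-sink
connector.class=org.apache.kafka.connect.file.FileStreamSinkConnector
tasks.max=1
topics=my-avro-topic
file=/tmp/sink-output.txt

# launch: bin/connect-standalone.sh worker.properties sink.properties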
Alternatively, you can set up Schema Registry alongside your vanilla Kafka cluster. There is no need to wait for a "Confluent migration": the registry requires no changes to your existing brokers, only to your serializer and deserializer configs.
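If you go that route, the Connect-side equivalent of the serializer change is pointing the stock AvroConverter at the registry; the URL below is a placeholder assuming the registry's default port 8081:

key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081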