An example project using Akka Streams with Kafka and a Schema Registry.
It uses the Alpakka Kafka Connector to write messages in Avro format.
The Alpakka Kafka Connector documentation provides clear examples of how to use Akka Streams with Kafka, but it doesn't show how to combine Akka Streams with Avro and a Schema Registry. This project provides such an example.
Kafka itself is agnostic with regard to the message format, but Avro with a Schema Registry is the preferred solution. Compared to a schemaless format such as JSON, a format with a schema (e.g. Avro with a Schema Registry, or Protocol Buffers) has the advantage that it is known exactly what a message contains, and that backwards compatibility can be maintained when the schema evolves.
Using Avro with a Schema Registry has the advantage that the schema definition doesn't have to be added to every single message; instead, the schema is registered in the Schema Registry and only a reference to it is stored in each message.
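To illustrate how such a reference can be embedded, here is a small sketch of the Confluent wire format, in which each message carries a single magic byte and a four-byte schema id in front of the Avro payload (the helper name framePayload is made up for illustration):

```scala
import java.nio.ByteBuffer

// Confluent wire format: magic byte 0, a 4-byte schema id, then the Avro-encoded payload.
// A consumer reads the id and fetches the full schema from the Schema Registry.
def framePayload(schemaId: Int, avroBytes: Array[Byte]): Array[Byte] =
  ByteBuffer
    .allocate(1 + 4 + avroBytes.length)
    .put(0: Byte)      // magic byte marking the wire format version
    .putInt(schemaId)  // reference to the schema in the Schema Registry
    .put(avroBytes)    // the actual Avro binary payload
    .array()
```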
avro4s is used in this example project to automatically derive the Avro schema from the Scala case class and to serialize and deserialize its instances.
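As a minimal sketch of the schema derivation (assuming a hypothetical SampleEvent with an id and a name field; the real case class may differ), avro4s can produce the schema with a single call:

```scala
import com.sksamuel.avro4s.AvroSchema

// Hypothetical shape of the event; the project's actual SampleEvent may differ.
case class SampleEvent(id: String, name: String)

object SchemaDemo extends App {
  // avro4s derives the Avro record schema from the case class definition.
  val schema = AvroSchema[SampleEvent]
  println(schema) // an org.apache.avro.Schema describing the record
}
```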
To run the project, you need to have Docker Compose installed. In the root directory of the project, run docker-compose up and wait until ZooKeeper, Kafka and the Schema Registry have started. Run docker ps to verify that ZooKeeper, Kafka and the Schema Registry are all up.
After that, you can start the producer with sbt "runMain KafkaProducer" and the consumer with sbt "runMain KafkaConsumer".
The sample producer KafkaProducer creates a few instances of SampleEvent, serializes them in Avro format, and sends them to the Kafka topic mytopic.
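A producer along these lines could be sketched as follows (a rough sketch, not the project's actual code: the bootstrap server address, the field names of SampleEvent and the avro4s 4.x serialization API are assumptions):

```scala
import java.io.ByteArrayOutputStream

import akka.actor.ActorSystem
import akka.kafka.ProducerSettings
import akka.kafka.scaladsl.Producer
import akka.stream.scaladsl.Source
import com.sksamuel.avro4s.AvroOutputStream
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.{ByteArraySerializer, StringSerializer}

case class SampleEvent(id: String, name: String) // assumed shape

object ProducerSketch extends App {
  implicit val system: ActorSystem = ActorSystem("producer")

  // Serialize a SampleEvent to Avro binary with avro4s (4.x API assumed).
  def toAvro(event: SampleEvent): Array[Byte] = {
    val baos = new ByteArrayOutputStream()
    val out  = AvroOutputStream.binary[SampleEvent].to(baos).build()
    out.write(event)
    out.close()
    baos.toByteArray
  }

  val settings = ProducerSettings(system, new StringSerializer, new ByteArraySerializer)
    .withBootstrapServers("localhost:9092")

  Source(List(SampleEvent("1", "first"), SampleEvent("2", "second")))
    .map(e => new ProducerRecord[String, Array[Byte]]("mytopic", e.id, toAvro(e)))
    .runWith(Producer.plainSink(settings))
}
```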
The sample consumer KafkaConsumer reads from the topic mytopic, deserializes the messages into instances of SampleEvent, and logs them. In this example project, the consumer stops automatically after 30 seconds.
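Correspondingly, a consumer that stops after 30 seconds could be sketched like this (again a rough sketch, not the project's actual code; the group id, the field names of SampleEvent and the avro4s 4.x API are assumptions):

```scala
import akka.actor.ActorSystem
import akka.kafka.{ConsumerSettings, Subscriptions}
import akka.kafka.scaladsl.Consumer
import akka.stream.scaladsl.Sink
import com.sksamuel.avro4s.{AvroInputStream, AvroSchema}
import org.apache.kafka.common.serialization.{ByteArrayDeserializer, StringDeserializer}

import scala.concurrent.duration._

case class SampleEvent(id: String, name: String) // assumed shape

object ConsumerSketch extends App {
  implicit val system: ActorSystem = ActorSystem("consumer")

  val schema = AvroSchema[SampleEvent]

  // Deserialize Avro binary back into a SampleEvent (avro4s 4.x API assumed).
  def fromAvro(bytes: Array[Byte]): SampleEvent =
    AvroInputStream.binary[SampleEvent].from(bytes).build(schema).iterator.next()

  val settings = ConsumerSettings(system, new StringDeserializer, new ByteArrayDeserializer)
    .withBootstrapServers("localhost:9092")
    .withGroupId("sample-group")

  Consumer
    .plainSource(settings, Subscriptions.topics("mytopic"))
    .takeWithin(30.seconds) // stop consuming after 30 seconds, as this project does
    .map(record => fromAvro(record.value))
    .runWith(Sink.foreach(event => system.log.info(s"Received: $event")))
}
```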