spring-kafka

Project

Overview

This project was initially created with the aim of sharing knowledge with a development team. It provides a docker-compose setup that creates and starts the Kafka brokers and the Kafka consumers. The Kafka consumers are built with Spring. JMeter was used to produce messages, but you can produce them with another tool such as Magic Kafka.
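
For reference, each Spring consumer essentially boils down to a @KafkaListener bean. The sketch below is illustrative rather than the repository's exact code: the topic name "example" and group id "test" are assumptions borrowed from the commands later in this README, and the bootstrap servers are expected to come from the application properties.

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Minimal Spring Kafka consumer sketch. Every container running this
// application joins the same consumer group, so Kafka spreads the topic
// partitions across the running instances; that redistribution is the
// rebalancing you can observe in the tests described below.
@Component
public class ExampleConsumer {

    @KafkaListener(topics = "example", groupId = "test")
    public void listen(String message) {
        System.out.println("Received: " + message);
    }
}
```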

How to run?

To run the project and see it working you need to follow a few steps:

Creating the consumer image for Docker

You will need to create the Docker image used by the Kafka consumer services. The image is built from a simple Spring application. To generate it, follow these steps:

  • In the 'spring-consumer' directory, run './mvnw clean package install';
  • Then, in the same directory, execute 'docker build -t springio/spring-consumer .'.

Starting docker-compose

Change to the 'compose' directory and run 'docker-compose up -d'. All services in the docker-compose file will be started. If for any reason a Kafka broker is down after this, you can just run 'docker start <container_name>'.
During the tests you can stop and restart the spring-consumer or Kafka containers to watch the rebalancing happen.

Producing messages

To see all the magic happening you need to send messages to the Kafka topic. JMeter is probably the best option for that, but you can also use another tool such as Magic Kafka. Note that the latter is not free, although it has a demo license with a time expiration.

JMeter

Apache JMeter must be installed; version 5.5 of JMeter was used. After that, also install the pepper-box lib. Then go to File > Open and select the "kafka-producer-jmeter-profile.jmx" file, available in this repository.
The Message section holds the message to be sent to the Kafka topic. This message uses a custom function from the pepper-box lib. The function was created only to return the current time. You can use it this way or change the custom function to a fixed value.
To use a custom function you need to create it in the CustomFunctions class, as in the sketch below.
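
Here is a minimal sketch of such a function, assuming pepper-box's convention of static methods on its CustomFunctions class; the package and the method name CURRENT_TIME are illustrative, not the repository's exact code. After adding the method you rebuild the pepper-box jar and call it from the JMeter message template, for example {{CURRENT_TIME()}}.

```java
package com.gslab.pepper.input;

import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Assumed pepper-box custom function: static methods in this class can be
// called from the pepper-box message template in JMeter.
public class CustomFunctions {

    // Returns the current time as an ISO-8601 string, e.g. 2024-01-01T12:00:00.
    public static String CURRENT_TIME() {
        return LocalDateTime.now().format(DateTimeFormatter.ISO_LOCAL_DATE_TIME);
    }
}
```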

Some test ideas

  • You can watch the spring-consumer containers' logs and observe how they behave when the message throughput increases;
  • You can observe how the remaining consumers behave when some consumers and/or Kafka brokers are stopped;
  • You can run some commands inside a Kafka broker to understand the Kafka cluster better. Some useful commands:
    • docker exec -it <container_name> bash -> open a bash shell inside the Kafka container;
    • kafka-topics --create --topic=example --bootstrap-server=localhost:9092 --partitions=3 -> create a topic;
    • kafka-topics --list --bootstrap-server=localhost:9092 -> list all topics;
    • kafka-console-producer --topic=example --bootstrap-server=localhost:9092 -> open a console to produce messages;
    • kafka-console-consumer --topic=example --bootstrap-server=localhost:9092 -> open a console to consume messages;
    • kafka-topics --describe --bootstrap-server=localhost:9092 --topic=example -> show the details of the topic;
    • kafka-consumer-groups --group test --describe --bootstrap-server localhost:19092 -> describe the consumers of the group;
