This module demonstrates the following:
- The use of the Kafka Streams DSL, including `process()`.
- Accessing and enriching records with metadata using the processor context.
- Unit testing with the Topology Test Driver.
In this module, records of type `<String, KafkaUser>` are streamed from a topic named `USER_TOPIC`.
The following tasks are performed:
- Each record is processed by a custom processor that enriches the value with metadata such as the topic, partition, and offset of the record. The processor also changes the key of the record to the user's last name.
- The processed records with enriched metadata are written to a new topic named `USER_PROCESS_TOPIC`.
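As an illustration of how the `process()` operator and the processor context fit together, here is a minimal sketch of such a topology. It is not the module's actual code: the topic names come from this page, but the class names and the `KafkaUser` accessors (`getLastName()`, `setTopic()`, `setPartition()`, `setOffset()`) are assumptions about the Avro-generated model, and default String/Avro serdes are assumed to be configured.

```java
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.processor.api.ContextualProcessor;
import org.apache.kafka.streams.processor.api.ProcessorSupplier;
import org.apache.kafka.streams.processor.api.Record;

public class ProcessTopologySketch {

    public static Topology topology(StreamsBuilder streamsBuilder) {
        // Supplier for the custom processor used by the DSL process() operator.
        ProcessorSupplier<String, KafkaUser, String, KafkaUser> processorSupplier = UserMetadataProcessor::new;

        streamsBuilder
            .<String, KafkaUser>stream("USER_TOPIC")   // default String/Avro serdes assumed
            .process(processorSupplier)
            .to("USER_PROCESS_TOPIC");

        return streamsBuilder.build();
    }

    // Enriches each KafkaUser with record metadata and re-keys the record on the last name.
    static class UserMetadataProcessor extends ContextualProcessor<String, KafkaUser, String, KafkaUser> {

        @Override
        public void process(Record<String, KafkaUser> record) {
            KafkaUser user = record.value();

            // Topic, partition, and offset are exposed by the processor context.
            // The setters are assumptions about the KafkaUser Avro schema.
            context().recordMetadata().ifPresent(metadata -> {
                user.setTopic(metadata.topic());
                user.setPartition(metadata.partition());
                user.setOffset(metadata.offset());
            });

            // Change the key of the record to the user's last name before forwarding it downstream.
            context().forward(record.withKey(user.getLastName()).withValue(user));
        }
    }
}
```

The processor extends `ContextualProcessor`, so the processor context is available through `context()`; note that `recordMetadata()` is an `Optional` and is only present when the record originates from an input topic.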
To compile and run this demo, you will need the following:
- Java 21
- Maven
- Docker
To run the application manually:
- Start a Confluent Platform in a Docker environment.
- Produce records of type `<String, KafkaUser>` to a topic named `USER_TOPIC`. You can use the producer User to do this.
- Start the Kafka Streams application.
To run the application in Docker, use the following command:
```console
docker-compose up -d
```
This command will start the following services in Docker:
- 1 Kafka broker (KRaft mode)
- 1 Schema Registry
- 1 Control Center
- 1 producer User
- 1 Kafka Streams Process
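Regarding the unit testing mentioned at the top of this module, the Topology Test Driver lets the topology be exercised without a running broker. Below is a minimal sketch, assuming the `ProcessTopologySketch.topology()` builder from the sketch above, a mock Schema Registry URL for the Avro serde, and illustrative `KafkaUser` fields (`firstName`, `lastName`, plus a `topic` field set by the processor).

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import io.confluent.kafka.serializers.AbstractKafkaSchemaSerDeConfig;
import io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.TestOutputTopic;
import org.apache.kafka.streams.TopologyTestDriver;
import org.junit.jupiter.api.Test;

class ProcessTopologySketchTest {

    @Test
    void shouldEnrichWithMetadataAndRekeyByLastName() {
        // Point the Avro serde at a mock Schema Registry so no real registry is needed.
        Map<String, String> serdeConfig =
            Map.of(AbstractKafkaSchemaSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "mock://test");

        SpecificAvroSerde<KafkaUser> userSerde = new SpecificAvroSerde<>();
        userSerde.configure(serdeConfig, false);

        Properties properties = new Properties();
        properties.put(StreamsConfig.APPLICATION_ID_CONFIG, "process-test");
        properties.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "mock:1234");
        properties.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.StringSerde.class);
        properties.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, SpecificAvroSerde.class);
        properties.put(AbstractKafkaSchemaSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "mock://test");

        try (TopologyTestDriver testDriver =
                 new TopologyTestDriver(ProcessTopologySketch.topology(new StreamsBuilder()), properties)) {

            TestInputTopic<String, KafkaUser> inputTopic = testDriver.createInputTopic(
                "USER_TOPIC", new StringSerializer(), userSerde.serializer());
            TestOutputTopic<String, KafkaUser> outputTopic = testDriver.createOutputTopic(
                "USER_PROCESS_TOPIC", new StringDeserializer(), userSerde.deserializer());

            // KafkaUser fields are illustrative; adapt them to the module's Avro schema.
            KafkaUser user = KafkaUser.newBuilder()
                .setFirstName("Jane")
                .setLastName("Doe")
                .build();

            inputTopic.pipeInput("1", user);

            KeyValue<String, KafkaUser> result = outputTopic.readKeyValue();
            assertEquals("Doe", result.key);                      // record re-keyed on the last name
            assertEquals("USER_TOPIC", result.value.getTopic());  // value enriched with the source topic
        }
    }
}
```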