Kafka Streams Schedule

This module demonstrates the following:

  • The use of the Processor API, including process() and schedule().
  • The processor context and the scheduling of tasks based on wall clock time and stream time.
  • The creation of a timestamped key-value store.
  • Unit testing using Topology Test Driver.
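
Both punctuation types can be combined in a single processor. The following is a minimal sketch, not the module's actual code: the class name UserCountProcessor, the store name user-count-store, and the KafkaUser#getNationality() accessor are all assumptions made for illustration.

```java
import java.time.Duration;
import java.util.ArrayList;
import java.util.List;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.processor.PunctuationType;
import org.apache.kafka.streams.processor.api.Processor;
import org.apache.kafka.streams.processor.api.ProcessorContext;
import org.apache.kafka.streams.processor.api.Record;
import org.apache.kafka.streams.state.KeyValueIterator;
import org.apache.kafka.streams.state.TimestampedKeyValueStore;
import org.apache.kafka.streams.state.ValueAndTimestamp;

public class UserCountProcessor implements Processor<String, KafkaUser, String, Long> {

    private ProcessorContext<String, Long> context;
    private TimestampedKeyValueStore<String, Long> store;

    @Override
    public void init(ProcessorContext<String, Long> context) {
        this.context = context;
        this.store = context.getStateStore("user-count-store");

        // Emit every counter once per minute of stream time: this punctuator
        // fires as record timestamps advance, not as real time passes.
        context.schedule(Duration.ofMinutes(1), PunctuationType.STREAM_TIME, timestamp -> {
            try (KeyValueIterator<String, ValueAndTimestamp<Long>> iterator = store.all()) {
                while (iterator.hasNext()) {
                    KeyValue<String, ValueAndTimestamp<Long>> entry = iterator.next();
                    context.forward(new Record<>(entry.key, entry.value.value(), timestamp));
                }
            }
        });

        // Reset every counter once per two minutes of wall clock time,
        // regardless of whether any record has arrived in the meantime.
        context.schedule(Duration.ofMinutes(2), PunctuationType.WALL_CLOCK_TIME, timestamp -> {
            List<String> keys = new ArrayList<>();
            try (KeyValueIterator<String, ValueAndTimestamp<Long>> iterator = store.all()) {
                while (iterator.hasNext()) {
                    keys.add(iterator.next().key);
                }
            }
            keys.forEach(store::delete);
        });
    }

    @Override
    public void process(Record<String, KafkaUser> record) {
        String nationality = record.value().getNationality();
        ValueAndTimestamp<Long> current = store.get(nationality);
        long count = current == null ? 1L : current.value() + 1L;
        store.put(nationality, ValueAndTimestamp.make(count, record.timestamp()));
    }
}
```

Note that the reset punctuator collects the keys before deleting them, so the store is never mutated while its iterator is open.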

In this module, records of type <String, KafkaUser> are streamed from a topic named USER_TOPIC. The following tasks are performed:

  1. Processes the stream with a custom processor that:
    • Counts the number of users by nationality.
    • Emits the counters for all nationalities every minute, based on the stream time.
    • Resets the counters every two minutes, based on the wall clock time.
  2. Writes the processed results to a new topic named USER_SCHEDULE_TOPIC.
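
Wired together, the steps above might look like the following sketch. The processor class (UserCountProcessor), the store name (user-count-store), and the serdes are assumptions for illustration; KStream#process returning a downstream KStream requires Kafka Streams 3.3 or later.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.state.Stores;

public class UserScheduleTopology {

    public static Topology build() {
        StreamsBuilder builder = new StreamsBuilder();

        // Register the timestamped key-value store backing the processor.
        builder.addStateStore(Stores.timestampedKeyValueStoreBuilder(
            Stores.persistentTimestampedKeyValueStore("user-count-store"),
            Serdes.String(),
            Serdes.Long()));

        // Connect the processor to its store by name, then write the
        // emitted <nationality, count> records to the output topic.
        builder.<String, KafkaUser>stream("USER_TOPIC")
            .process(UserCountProcessor::new, "user-count-store")
            .to("USER_SCHEDULE_TOPIC", Produced.with(Serdes.String(), Serdes.Long()));

        return builder.build();
    }
}
```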

topology.png
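
In a Topology Test Driver test, stream time advances only with the timestamps of piped records, while wall clock time must be advanced explicitly. A sketch of exercising both punctuators, where the topology, properties, KafkaUser serializer, and user instances are placeholders to be supplied by the real test setup:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.Properties;
import org.apache.kafka.common.serialization.LongDeserializer;
import org.apache.kafka.common.serialization.Serializer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.TestOutputTopic;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.TopologyTestDriver;

public class ScheduleTestSketch {

    // topology, props, userSerializer, and the KafkaUser arguments are
    // placeholders; the actual test wires in the real ones.
    static void exerciseSchedules(Topology topology, Properties props,
                                  Serializer<KafkaUser> userSerializer,
                                  KafkaUser user1, KafkaUser user2) {
        try (TopologyTestDriver driver = new TopologyTestDriver(topology, props)) {
            TestInputTopic<String, KafkaUser> input = driver.createInputTopic(
                "USER_TOPIC", new StringSerializer(), userSerializer);
            TestOutputTopic<String, Long> output = driver.createOutputTopic(
                "USER_SCHEDULE_TOPIC", new StringDeserializer(), new LongDeserializer());

            // Stream time advances only with record timestamps: piping records
            // one minute apart lets the STREAM_TIME punctuator fire.
            input.pipeInput("1", user1, Instant.parse("2024-01-01T00:00:00Z"));
            input.pipeInput("2", user2, Instant.parse("2024-01-01T00:01:00Z"));

            // Wall clock time never moves on its own in the driver; advance it
            // explicitly to trigger the WALL_CLOCK_TIME punctuator (the reset).
            driver.advanceWallClockTime(Duration.ofMinutes(2));

            // Inspect emitted counters, e.g. via output.readKeyValuesToList().
        }
    }
}
```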

Prerequisites

To compile and run this demo, you will need the following:

  • Java 21
  • Maven
  • Docker

Running the Application

To run the application manually:

  • Start a Confluent Platform in a Docker environment.
  • Produce records of type <String, KafkaUser> to a topic named USER_TOPIC. You can use the producer user to do this.
  • Start the Kafka Streams application.

To run the application in Docker, use the following command:

docker-compose up -d

This command will start the following services in Docker:

  • 1 Kafka broker (KRaft mode)
  • 1 Schema registry
  • 1 Control Center
  • 1 producer user
  • 1 Kafka Streams Schedule instance