Kafka Streams Schedule

This module streams records of type <String, KafkaUser> from the USER_TOPIC and uses the Processor API to count users by nationality. It then schedules the emission of these counts to the downstream processor based on wall clock time and stream time. It demonstrates the following:

  • How to use the Kafka Streams Processor API, including process() and schedule() (a sketch of such a processor follows this list).
  • How to use the processor context to schedule tasks based on wall clock time and stream time.
  • How to create a timestamped key-value store.
  • Unit testing using the Topology Test Driver (a test sketch follows the topology diagram below).
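
As a rough illustration of the first three points, here is a minimal sketch of such a processor. The class name, the store name, the one-minute intervals and the getNationality() accessor on KafkaUser are assumptions made for the sketch, not necessarily the module's actual code.

import java.time.Duration;
import org.apache.kafka.streams.processor.PunctuationType;
import org.apache.kafka.streams.processor.api.Processor;
import org.apache.kafka.streams.processor.api.ProcessorContext;
import org.apache.kafka.streams.processor.api.Record;
import org.apache.kafka.streams.state.KeyValueIterator;
import org.apache.kafka.streams.state.TimestampedKeyValueStore;
import org.apache.kafka.streams.state.ValueAndTimestamp;

// Sketch of a processor that counts users by nationality in a timestamped
// key-value store and emits the counts on two punctuations: one driven by
// wall clock time, one driven by stream time.
public class CountByNationalityProcessor implements Processor<String, KafkaUser, String, Long> {

    private ProcessorContext<String, Long> context;
    private TimestampedKeyValueStore<String, Long> store;

    @Override
    public void init(ProcessorContext<String, Long> context) {
        this.context = context;

        // "user-count-store" is an assumed name; the store must be registered on the
        // topology, e.g. with Stores.timestampedKeyValueStoreBuilder(
        //     Stores.persistentTimestampedKeyValueStore("user-count-store"),
        //     Serdes.String(), Serdes.Long()), and connected to this processor.
        this.store = context.getStateStore("user-count-store");

        // Fires every minute of real time, whether or not records keep arriving.
        context.schedule(Duration.ofMinutes(1), PunctuationType.WALL_CLOCK_TIME, this::emitCounts);

        // Fires every minute of stream time, i.e. as record timestamps advance.
        context.schedule(Duration.ofMinutes(1), PunctuationType.STREAM_TIME, this::emitCounts);
    }

    @Override
    public void process(Record<String, KafkaUser> record) {
        // getNationality() is assumed to exist on the KafkaUser record type.
        String nationality = record.value().getNationality();
        ValueAndTimestamp<Long> current = store.get(nationality);
        long count = current == null ? 1L : current.value() + 1L;
        store.put(nationality, ValueAndTimestamp.make(count, record.timestamp()));
    }

    private void emitCounts(long timestamp) {
        try (KeyValueIterator<String, ValueAndTimestamp<Long>> iterator = store.all()) {
            iterator.forEachRemaining(entry ->
                context.forward(new Record<>(entry.key, entry.value.value(), timestamp)));
        }
    }
}

Note the contrast between the two punctuation types: the wall clock punctuator keeps firing at fixed real-time intervals even when no records arrive, while the stream time punctuator only advances as incoming record timestamps do.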

Topology diagram: topology.png
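
The Topology Test Driver makes both punctuation types easy to exercise without a broker: piping records with chosen timestamps advances stream time, and advanceWallClockTime() advances wall clock time. The sketch below shows that mechanism with a self-contained <String, String> topology and a main method in place of the module's actual topology and JUnit tests; topic names and intervals here are placeholders.

import java.time.Duration;
import java.time.Instant;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.TestOutputTopic;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.TopologyTestDriver;
import org.apache.kafka.streams.processor.PunctuationType;
import org.apache.kafka.streams.processor.api.ContextualProcessor;
import org.apache.kafka.streams.processor.api.ProcessorContext;
import org.apache.kafka.streams.processor.api.Record;

public class SchedulePunctuationTest {

    public static void main(String[] args) {
        Topology topology = new Topology();
        topology.addSource("source", new StringDeserializer(), new StringDeserializer(), "input-topic");
        topology.addProcessor("ticker", TickProcessor::new, "source");
        topology.addSink("sink", "output-topic", new StringSerializer(), new StringSerializer(), "ticker");

        try (TopologyTestDriver driver = new TopologyTestDriver(topology)) {
            TestInputTopic<String, String> input =
                driver.createInputTopic("input-topic", new StringSerializer(), new StringSerializer());
            TestOutputTopic<String, String> output =
                driver.createOutputTopic("output-topic", new StringDeserializer(), new StringDeserializer());

            // Stream time only moves with record timestamps: piping records whose
            // timestamps are more than one minute apart lets the stream-time punctuation fire.
            input.pipeInput("k", "v1", Instant.parse("2024-01-01T00:00:00Z"));
            input.pipeInput("k", "v2", Instant.parse("2024-01-01T00:02:00Z"));

            // Wall clock time never moves on its own in the test driver; advancing it
            // explicitly fires the wall-clock punctuation without any new record.
            driver.advanceWallClockTime(Duration.ofMinutes(1));

            output.readValuesToList().forEach(System.out::println);
        }
    }

    // Minimal processor that schedules one punctuator per PunctuationType and
    // forwards a marker record each time one of them fires.
    static class TickProcessor extends ContextualProcessor<String, String, String, String> {

        @Override
        public void init(ProcessorContext<String, String> context) {
            super.init(context);
            context.schedule(Duration.ofMinutes(1), PunctuationType.STREAM_TIME,
                timestamp -> context.forward(new Record<>("tick", "stream-time", timestamp)));
            context.schedule(Duration.ofMinutes(1), PunctuationType.WALL_CLOCK_TIME,
                timestamp -> context.forward(new Record<>("tick", "wall-clock-time", timestamp)));
        }

        @Override
        public void process(Record<String, String> record) {
            // Records are consumed only to advance stream time in this sketch.
        }
    }
}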

Prerequisites

To compile and run this demo, you’ll need:

  • Java 21
  • Maven
  • Docker

Running the Application

To run the application manually:

  • Start a Confluent Platform in a Docker environment.
  • Produce records of type <String, KafkaUser> to the USER_TOPIC. You can use the Producer User for this.
  • Start the Kafka Streams application (a minimal bootstrap sketch follows this list).
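
For the last step, here is a minimal sketch of bootstrapping a Kafka Streams application. The application id, bootstrap server address, output topic name and the placeholder topology builder are assumptions; the real module wires in its own processor, timestamped store and serdes.

import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public class KafkaStreamsScheduleApplication {

    public static void main(String[] args) {
        Properties properties = new Properties();
        properties.put(StreamsConfig.APPLICATION_ID_CONFIG, "kafka-streams-schedule"); // assumed application id
        properties.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");      // assumed broker address

        KafkaStreams streams = new KafkaStreams(buildTopology(), properties);
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close)); // close cleanly on shutdown
        streams.start();
    }

    private static Topology buildTopology() {
        // Placeholder topology so the sketch compiles on its own; the real module
        // registers its processor, timestamped store and sink here instead.
        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("USER_TOPIC", Consumed.with(Serdes.String(), Serdes.String()))
               .to("USER_COUNT_TOPIC", Produced.with(Serdes.String(), Serdes.String())); // output topic name is assumed
        return builder.build();
    }
}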

To run the application in Docker, use the following command:

docker-compose up -d

This command will start the following services in Docker:

  • 1 Kafka Broker (KRaft mode)
  • 1 Schema Registry
  • 1 Control Center
  • 1 Producer User
  • 1 Kafka Streams Schedule