Amr Ali Eissa

Sr. Data Engineer

Sr. Software Engineer

How to Set Up a Simple Kafka Consumer in Java

February 24, 2025 · Apache Kafka, Kafka Consumer

In this post, we’ll walk through creating a simple Apache Kafka consumer in Java and using it to read messages from a Kafka topic. By the end, you’ll have a clear, step-by-step guide on how to set up your consumer and verify that it’s receiving messages—essential for any Kafka or stream processing application.

1. Create a Kafka Topic

  1. Open your Confluent Control Center (installed in a previous post).
  2. From the Control Center home page, click on Topics.
  3. Select + Add topic.
  4. Name your topic (e.g., test-topic) and click Create with defaults.

Your topic is now ready—currently with no messages in it.
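
If you prefer to create the topic from code instead of the Control Center UI, the kafka-clients library we add in step 2 also ships an admin client. Here is a minimal sketch (an illustrative example, not part of the original setup; the class name is just for demonstration), assuming a single local broker on localhost:9092:

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTopic {
    public static void main(String[] args) throws Exception {
        // Illustrative example: assumes a single local broker at localhost:9092
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // One partition and replication factor 1 are fine for a local, single-broker setup
            NewTopic topic = new NewTopic("test-topic", 1, (short) 1);
            admin.createTopics(Collections.singletonList(topic)).all().get();
            System.out.println("Created topic: " + topic.name());
        }
    }
}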

2. Set Up a New Java Maven Project

Create a blank Java Maven project using your favorite IDE (e.g., IntelliJ IDEA or Eclipse).

Add the Kafka Clients Dependency

In your project’s pom.xml, add the following dependency inside the <dependencies> section:

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>3.8.0</version>
</dependency>

This pulls in the Kafka libraries needed for our consumer.

3. Define Consumer Properties

Inside your main class, create a helper method to load basic consumer settings:

private static Properties getConsumerProperties() {
    Properties props = new Properties();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "test-group");
    return props;
}

Here’s what each property does:

  • BOOTSTRAP_SERVERS_CONFIG: Points to the Kafka broker(s) (in this case, running on your local machine).
  • KEY_DESERIALIZER_CLASS_CONFIG and VALUE_DESERIALIZER_CLASS_CONFIG: Tell the consumer how to deserialize message keys and values.
  • GROUP_ID_CONFIG: Assigns the consumer to a group. Consumers with the same group ID balance the topic’s partitions among themselves.
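
For reference, the code in this post relies on a handful of imports at the top of your class. Your IDE will usually add them for you, but they should look roughly like this:

import java.time.Duration;
import java.util.Properties;
import java.util.regex.Pattern;

import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;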

4. Write the Main Method

Next, modify your main method to subscribe to the topic and poll for messages:

public static void main(String[] args) {
    try (Consumer<String, String> consumer = new KafkaConsumer<>(getConsumerProperties())) {
        // Subscribe to your topic (using a regex pattern for flexibility)
        consumer.subscribe(Pattern.compile("test-topic"));

        System.out.println("Listening for new messages...");

        // Poll the topic for up to 2 minutes
        ConsumerRecords<String, String> consumerRecords = consumer.poll(Duration.ofMinutes(2));

        // Print any received records
        consumerRecords.forEach(record ->
            System.out.println(record.key() + " : " + record.value())
        );
    } catch (Exception e) {
        System.out.println("Something went wrong: " + e.getMessage());
    }
}

  • consumer.subscribe(…): Subscribes this consumer to the topic (here we use Pattern.compile to allow matching multiple topics by regex if desired).
  • consumer.poll(…): Waits for new messages. Once messages are received, they’re returned as a ConsumerRecords collection.
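
The example above calls poll only once, which is enough to verify that messages arrive. In a long-running application you would normally poll in a loop so the consumer keeps reading for as long as it runs. A minimal sketch of that pattern (an illustrative variation, not the code used later in this post):

try (Consumer<String, String> consumer = new KafkaConsumer<>(getConsumerProperties())) {
    consumer.subscribe(Pattern.compile("test-topic"));

    // Illustrative sketch: poll repeatedly instead of the single two-minute poll above
    while (true) {
        // Wait up to one second for new records, then loop and poll again
        ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
        records.forEach(record ->
            System.out.println(record.key() + " : " + record.value())
        );
    }
}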

5. Produce a Test Message

  1. Keep your Java consumer running so you can see the output in real time.
  2. In Confluent Control Center, open the Messages tab under your topic.
  3. Click + Produce a new message to this topic.
  4. Enter a key/value pair (e.g., {"value":"test1"} for the value and 1 for the key).
  5. Press Produce.
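
Alternatively, if you would rather produce the test message from code than through Control Center, a small Java producer using the same kafka-clients dependency does the job. A minimal sketch (illustrative only; the class name is just for demonstration), assuming the same local broker and topic:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TestProducer {
    public static void main(String[] args) {
        // Illustrative example: assumes the same local broker at localhost:9092
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            // Key "1", value matching the example payload used in Control Center
            producer.send(new ProducerRecord<>("test-topic", "1", "{\"value\":\"test1\"}"));
            producer.flush();
        }
    }
}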

You should see the consumer print something like:

Listening for new messages...
1 : {"value":"test1"}

That’s it! You have a fully functioning, simple Kafka consumer in Java.

Conclusion and Next Steps

Congratulations on consuming your first Kafka message in Java. In upcoming posts, we’ll cover more advanced Kafka consumer and producer scenarios. For now, you’ve laid the groundwork for connecting to Kafka, subscribing to topics, and reading real-time messages in your applications.

Stay tuned for more Kafka tutorials, where we’ll explore topics like message partitioning, consumer offsets, and advanced configurations to handle large-scale data streams. Happy coding!
