
Building Microservices with Node.js and Kafka: A Step-by-Step Guide

This tutorial demonstrates how to build a robust and scalable microservices architecture using Node.js and Apache Kafka. We'll create two simple microservices: one that produces messages to a Kafka topic and another that consumes them. This pattern is fundamental to many real-world applications, allowing for asynchronous communication and decoupled services.
Prerequisites & Setup
Before diving in, make sure you have the following installed:
Node.js and npm (or yarn)
Docker and Docker Compose (recommended for running Kafka)
We'll use Docker Compose to simplify setting up a Kafka environment. Create a file named `docker-compose.yml` with the following content:
```yaml
version: '3.8'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Advertise localhost so Node.js clients running on the host can connect.
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      # Let the broker create 'my-topic' automatically on first use.
      KAFKA_AUTO_CREATE_TOPICS_ENABLE: "true"
```
Start Kafka using: `docker-compose up -d`
Step 1: The Producer Microservice
First, install the Kafka client with `npm install kafkajs`. Then create a file named `producer.js`:
```javascript
const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'my-producer',
  // Matches the port published in docker-compose.yml; the broker advertises
  // localhost:9092 so clients running on the host can reach it.
  brokers: ['localhost:9092'],
});

const producer = kafka.producer();

async function sendMessage() {
  await producer.connect();
  await producer.send({
    topic: 'my-topic',
    messages: [{ value: 'Hello from Node.js producer!' }],
  });
  await producer.disconnect();
}

sendMessage().catch(console.error);
```
This code initializes a Kafka producer, connects to the broker, sends a message to the 'my-topic' topic, and then disconnects.
Explanation:
- `kafkajs` is the Node.js client for Apache Kafka.
- We define the Kafka broker's address (the port published in the docker-compose file).
- `producer.connect()` establishes a connection to the Kafka broker.
- `producer.send()` sends the message.
- `producer.disconnect()` closes the connection gracefully.
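In a real service you will usually send structured data rather than plain strings. A minimal sketch of serializing a payload to JSON before sending (the `buildMessage` helper name is illustrative, not part of kafkajs):

```javascript
// Hypothetical helper: wrap a payload as a Kafka message with a JSON value.
// Messages that share a key always land on the same partition, which
// preserves ordering per key.
function buildMessage(key, payload) {
  return {
    key: String(key),
    value: JSON.stringify(payload),
  };
}

// The resulting object can go in producer.send()'s `messages` array.
const msg = buildMessage('order-42', { item: 'book', qty: 2 });
console.log(msg.value); // {"item":"book","qty":2}
```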
Step 2: The Consumer Microservice
Create a file named `consumer.js`:
```javascript
const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'my-consumer',
  brokers: ['localhost:9092'],
});

const consumer = kafka.consumer({ groupId: 'my-group' });

async function consumeMessages() {
  await consumer.connect();
  await consumer.subscribe({ topic: 'my-topic', fromBeginning: true });
  await consumer.run({
    // Called once per message; the value arrives as a Buffer,
    // so convert it to a string before logging.
    eachMessage: async ({ topic, partition, message }) => {
      console.log({
        value: message.value.toString(),
      });
    },
  });
}

consumeMessages().catch(console.error);
```
This code initializes a Kafka consumer, subscribes to 'my-topic', and logs any received messages.
Explanation:
- A consumer group (`my-group`) allows multiple consumers to share the workload.
- `fromBeginning: true` ensures the consumer reads all messages from the beginning of the topic.
- `consumer.run()` starts consuming messages, invoking `eachMessage` for every record received.
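Because `message.value` is a Buffer, structured payloads need to be decoded inside `eachMessage` before use. A hedged sketch, assuming the producer sent JSON (the `parseMessage` name is illustrative):

```javascript
// Hypothetical helper: decode a Kafka message value (a Buffer) into an object.
// Returns null instead of throwing when the payload is not valid JSON, so a
// single malformed message cannot crash the eachMessage handler.
function parseMessage(value) {
  try {
    return JSON.parse(value.toString('utf8'));
  } catch (err) {
    return null;
  }
}

// Example with a Buffer like the one kafkajs hands to eachMessage:
const decoded = parseMessage(Buffer.from('{"item":"book","qty":2}'));
console.log(decoded.qty); // 2
```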
Step 3: Running the Microservices
Open two terminal windows. In the first, run `node producer.js`. In the second, run `node consumer.js`. You should see the message "Hello from Node.js producer!" printed in the consumer's console.
Next Steps
This example provides a basic foundation. You can expand on this by:
- Implementing more complex message schemas using Avro or Protobuf.
- Introducing error handling and retries.
- Exploring different Kafka configurations for performance tuning.
- Building a complete application with multiple microservices interacting via Kafka.
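The error-handling point above can be sketched as a small exponential-backoff wrapper around any async operation, such as a `producer.send()` call. This is an illustration, not a kafkajs API (kafkajs also has its own built-in `retry` option on the client); the `withRetry` name and delays are assumptions:

```javascript
// Hypothetical sketch: retry an async operation with exponential backoff.
async function withRetry(operation, { attempts = 3, baseDelayMs = 100 } = {}) {
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await operation();
    } catch (err) {
      if (attempt === attempts - 1) throw err; // out of retries, give up
      const delay = baseDelayMs * 2 ** attempt; // 100 ms, 200 ms, 400 ms, ...
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}

// Example: an operation that fails twice, then succeeds on the third attempt.
let calls = 0;
withRetry(async () => {
  calls += 1;
  if (calls < 3) throw new Error('transient failure');
  return 'ok';
}).then((result) => console.log(result, 'after', calls, 'attempts'));
```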
This tutorial provides a practical starting point for leveraging the power of Node.js and Kafka to build scalable and distributed applications. By understanding these core concepts, you can build robust and efficient systems ready to handle real-world challenges.