
Kafka

Apache Kafka is an open-source distributed event streaming platform. In SAMO, Kafka works as a message broker between SAMO services and Logstash.

I. Kafka with Docker

Install Kafka

Add the kafka service to docker-compose.yml:

kafka:
  image: docker.io/bitnami/kafka:3.7
  restart: always
  environment:
    - KAFKA_ENABLE_KRAFT=yes
    - KAFKA_CFG_PROCESS_ROLES=broker,controller
    - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
    - KAFKA_CFG_LISTENERS=PLAINTEXT://:9092,CONTROLLER://:9093,EXTERNAL://:9094
    - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CONTROLLER:PLAINTEXT,PLAINTEXT:PLAINTEXT,EXTERNAL:PLAINTEXT
    - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://kafka:9092,EXTERNAL://kafka:9094
    - KAFKA_CFG_NODE_ID=1
    - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=1@kafka:9093
    - ALLOW_PLAINTEXT_LISTENER=yes
    - KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE=true
  ports:
    - "${kafka_port}:9092"
  volumes:
    - ${data_dir}/kafka:/bitnami
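The ${kafka_port} and ${data_dir} placeholders are interpolated by Docker Compose from a .env file next to docker-compose.yml. A hypothetical example (values and paths are illustrative; adjust them to your deployment):

```ini
kafka_port=9092
kafka_ui_port=8080
data_dir=/opt/samo/data
```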

Install Kafka UI

Kafka UI is a versatile, fast, and lightweight web UI for managing Apache Kafka clusters.

Add the kafka-ui service to docker-compose.yml:

kafka-ui:
  image: provectuslabs/kafka-ui:v0.7.2
  restart: always
  ports:
    - "${kafka_ui_port}:8080"
  environment:
    - DYNAMIC_CONFIG_ENABLED=true
    - LOGGING_LEVEL_ROOT=DEBUG
    - KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS=kafka:9094
  depends_on:
    - kafka
info

KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS points to the Kafka instance through the external port (9094).
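Kafka can take a few seconds to accept connections after its container starts, so it can help to gate kafka-ui's startup on a broker healthcheck. A sketch under the assumption that the Bitnami image keeps kafka-topics.sh on the PATH; add the healthcheck to the kafka service and change kafka-ui's depends_on accordingly:

```yaml
kafka:
  # ...existing configuration from above...
  healthcheck:
    test: ["CMD-SHELL", "kafka-topics.sh --bootstrap-server localhost:9092 --list"]
    interval: 10s
    timeout: 10s
    retries: 5

kafka-ui:
  # ...existing configuration from above...
  depends_on:
    kafka:
      condition: service_healthy
```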

II. Kafka on Ubuntu Server

1. Install Java

sudo apt update && sudo apt install openjdk-17-jdk -y
info

Java 11 or above is required for Kafka.

2. Download Apache Kafka

wget https://downloads.apache.org/kafka/3.7.0/kafka_2.13-3.7.0.tgz
tar -xvzf kafka_2.13-3.7.0.tgz
sudo mv kafka_2.13-3.7.0 /opt/kafka

3. Configure Kafka with KRaft

Open the KRaft configuration file:

sudo nano /opt/kafka/config/kraft/server.properties

Find and update the following settings:

process.roles=controller,broker
node.id=1
controller.quorum.voters=1@localhost:9093
listeners=PLAINTEXT://:9092,CONTROLLER://:9093,EXTERNAL://:9094
advertised.listeners=PLAINTEXT://localhost:9092,EXTERNAL://localhost:9094
listener.security.protocol.map=CONTROLLER:PLAINTEXT,PLAINTEXT:PLAINTEXT,EXTERNAL:PLAINTEXT
log.dirs=/opt/kafka/logs
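A quick way to confirm the edits took effect is to parse the file and check that the required keys are present. The helper below is a minimal sketch, not part of Kafka (the read_properties name is ours); it handles the simple key=value lines used here, not the full Java properties syntax:

```python
def read_properties(path):
    """Parse simple key=value lines into a dict, skipping comments and blanks."""
    props = {}
    with open(path) as fh:
        for raw in fh:
            line = raw.strip()
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition("=")
            props[key.strip()] = value.strip()
    return props


# Example: read_properties("/opt/kafka/config/kraft/server.properties")
# should contain at least these keys after the edits above:
REQUIRED_KEYS = {"process.roles", "node.id", "controller.quorum.voters",
                 "listeners", "advertised.listeners", "log.dirs"}
```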

4. Format Kafka Storage

Format Kafka's storage with a cluster UUID:

/opt/kafka/bin/kafka-storage.sh format -t 9OYVwS2kT0KnuFU46oCFfQ -c /opt/kafka/config/kraft/server.properties
warning

Replace the UUID (9OYVwS2kT0KnuFU46oCFfQ) with a unique value generated for your installation. You can generate a new UUID using /opt/kafka/bin/kafka-storage.sh random-uuid.

5. Create a Systemd Service for Kafka

Create a new systemd service file for Kafka:

sudo nano /etc/systemd/system/kafka.service

Add the following content, replacing kafka in the User line with the user that owns /opt/kafka (systemd does not expand shell variables such as $USER in unit files):

[Unit]
Description=Apache Kafka Server
After=network.target

[Service]
User=kafka
ExecStart=/opt/kafka/bin/kafka-server-start.sh /opt/kafka/config/kraft/server.properties
ExecStop=/opt/kafka/bin/kafka-server-stop.sh
Restart=on-abnormal

[Install]
WantedBy=multi-user.target

Save the file and reload the systemd daemon:

sudo systemctl daemon-reload

6. Start Kafka and Verify Status

Start Kafka and enable it to start on boot:

sudo systemctl enable kafka
sudo systemctl start kafka
tip

Use sudo systemctl status kafka to verify that Kafka is running correctly.
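Beyond systemctl status, a simple check is whether the broker is accepting TCP connections on its listener port. A minimal stdlib sketch (the host and port in the example comment are assumptions; adjust them to your listeners):

```python
import socket


def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port can be established."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Example: after `sudo systemctl start kafka`, the PLAINTEXT listener
# should answer: port_open("localhost", 9092) -> True when the broker is up
```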