chipkarsaish/Fraud_Detection_Collage_Mini_Project_SemV

Apache Kafka Setup with Docker + Python Producer & Consumer

This project sets up Apache Kafka using Docker containers and demonstrates a Producer-Consumer workflow in Python. The producer (producer.py) generates transaction data, while the consumer (app.py) consumes the data from a Kafka topic.


📌 Prerequisites

Before you start, ensure you have the following installed:

  • Docker and Docker Compose
  • Python 3
  • The kafka-python client library:

pip install kafka-python

📂 Project Structure

.
├── docker-compose.yml   # Docker Compose file for Kafka + ZooKeeper
├── producer.py          # Kafka producer that generates transactions
├── app.py               # Kafka consumer that reads transactions
└── README.md            # Setup guide
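
The docker-compose.yml referenced above might look roughly like the sketch below. This is an illustrative example only (the image names, versions, and environment variables are assumptions, not the project's actual file); the key points are the two services and the 9092/2181 port mapping described later in this guide.

```yaml
# Illustrative sketch -- the project's actual docker-compose.yml may differ.
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.4.0
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```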

🚀 Kafka Setup Instructions

1. Navigate to Project Directory

Make sure you are in the folder that contains the docker-compose.yml file:

cd /path/to/your/project

2. Start Kafka & ZooKeeper Containers

docker-compose up -d
  • -d runs the containers in the background.

  • This starts:

    • ZooKeeper (required by Kafka)
    • Kafka Broker

3. Verify Running Containers

docker ps

You should see containers for kafka and zookeeper.


📡 Kafka Topic Setup

1. Access Kafka Container

docker exec -it <kafka-container-name> bash

2. Create a Topic

bin/kafka-topics.sh --create \
  --topic transactions \
  --bootstrap-server localhost:9092 \
  --partitions 1 \
  --replication-factor 1

3. List Topics

bin/kafka-topics.sh --list --bootstrap-server localhost:9092

You should see transactions in the list.


📝 Running Producer & Consumer

1. Run Producer (producer.py)

This script generates transaction data and sends it to the Kafka topic transactions.

python producer.py

You should see messages being sent to Kafka.


2. Run Consumer (app.py)

This script consumes messages from the transactions topic and processes each record as it arrives (e.g., printing it to the console).

python app.py

You should see messages being received in real time.


🛑 Stopping Services

Stop Kafka and ZooKeeper:

docker-compose down

Restart later with:

docker-compose up -d

⚙️ Notes

  • Kafka default ports:

    • 9092 → Kafka broker
    • 2181 → ZooKeeper
  • Both producer.py and app.py are configured to use the topic transactions. If you change the topic name or Kafka connection settings, update the scripts accordingly.

  • Logs can be checked using:

docker logs <kafka-container-name>

✅ With this setup, you can:

  • Run Kafka & ZooKeeper inside Docker
  • Produce transaction data via producer.py
  • Consume transaction data via app.py
