The application creates test data in Kafka.
This project is built with Python 3 and Docker.
- Set up a Kafka server as needed - see Kafka Quick Setup
- Obtain an application server - AWS works, and it can share a server with Kafka
- In AWS, add the application server to the inbound rules of the Kafka server's security group
- Install Docker:
sudo apt install docker.io docker-compose nmon kafkacat -y
- Add the Docker group to the user:
sudo usermod -a -G docker ubuntu
- Log out and back in so the user gains access to Docker.
- Clone the application code from GitHub:
git clone https://github.com/JohnRTurner/phonedatagen.git
- Build the Docker image:
docker build phonedatagen -t phonedatagen
- Run the image:
docker run -d --name phonedatagen -e KAFKA_SERVER=$(hostname):29092 -e BATCH_SIZE=1000 -e KAFKA_TOPIC=test -e PROC_COUNT=8 -t phonedatagen
- View the logs (a snippet to sanity-check the topic follows this list):
docker logs -f phonedatagen
- Proceed to loading the data - see SingleStore Setup
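Before loading into SingleStore, it can help to confirm that records are actually landing on the topic. Below is a minimal sketch using the kafka-python package (an assumption; it is not installed by the steps above). The topic name and broker address mirror the docker run example and may need adjusting for your environment.

```python
# Sanity check: read a few records from the topic populated by phonedatagen.
# Assumes kafka-python is installed (pip install kafka-python) and that the
# broker and topic match the docker run example above.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "test",                               # KAFKA_TOPIC from the run command
    bootstrap_servers="localhost:29092",  # KAFKA_SERVER; adjust the host as needed
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,             # stop iterating after 5 seconds of silence
)

for i, message in enumerate(consumer):
    print(message.value)                  # raw message bytes as produced by the generator
    if i >= 4:                            # show the first five records, then stop
        break

consumer.close()
```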
Option | Description |
---|---|
BATCH_SIZE | Number of records sent per batch |
KAFKA_TOPIC | Kafka topic name (created if it does not exist) |
PROC_COUNT | Number of processes to run concurrently |
KAFKA_SERVER | Kafka server (host:port) |
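These options are passed to the container as environment variables. As a rough illustration, they might be consumed like the sketch below; the defaults shown are assumptions, not necessarily the values used in main.py.

```python
# Illustrative only: reading the options above inside the container.
# The default values here are assumptions for the sketch.
import os

KAFKA_SERVER = os.environ.get("KAFKA_SERVER", "localhost:29092")  # broker host:port
KAFKA_TOPIC = os.environ.get("KAFKA_TOPIC", "test")               # topic, created if missing
BATCH_SIZE = int(os.environ.get("BATCH_SIZE", "1000"))            # records per batch
PROC_COUNT = int(os.environ.get("PROC_COUNT", "8"))               # concurrent worker processes
```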
The code can be viewed on GitHub.
Filename | Description |
---|---|
main.py | Main module; takes the parameters and runs the generators |
datagenerators.py | Creates the data and sends it to Kafka (see the sketch after this table) |
kafka.py | Wrapper for Kafka calls |
README.md | This file |
.dockerignore | Files excluded from the Docker build context |
Dockerfile | File to build the Docker image |
requirements.txt | Python library requirements |
kafkasetup/README.md | Instructions to set up Kafka in Docker |
kafkasetup/docker-compose.yml | Sample docker-compose.yml |
singlestoresetup/README.md | Instructions to set up SingleStore with Pipelines |
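To give a feel for how the generator modules fit together, here is a rough, hypothetical sketch of generating fake phone records and sending them to Kafka with kafka-python. The field names, broker address, and function names are illustrative assumptions; the real datagenerators.py and kafka.py differ.

```python
# Hypothetical sketch of the generate-and-send flow; not the project's actual code.
import json
import random
import time

from kafka import KafkaProducer  # assumes the kafka-python package


def make_record() -> dict:
    """Build one fake phone-usage record (field names are illustrative)."""
    return {
        "phone_number": f"+1{random.randint(2000000000, 9999999999)}",
        "call_seconds": random.randint(1, 3600),
        "event_time": time.time(),
    }


def send_batch(producer: KafkaProducer, topic: str, batch_size: int) -> None:
    """Send batch_size generated records to the given Kafka topic."""
    for _ in range(batch_size):
        producer.send(topic, value=make_record())
    producer.flush()


if __name__ == "__main__":
    producer = KafkaProducer(
        bootstrap_servers="localhost:29092",                   # KAFKA_SERVER
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    send_batch(producer, topic="test", batch_size=1000)        # KAFKA_TOPIC, BATCH_SIZE
```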