[![Contributors][contributors-shield]](https://github.com/Berko01) [![LinkedIn][linkedin-shield]](https://www.linkedin.com/in/berkindundar/)
This project sets up an ELK (Elasticsearch, Logstash, Kibana) stack using Docker Compose. It includes a custom log generator service that produces log data and feeds it into Logstash, which then indexes the data in Elasticsearch. Kibana is used to visualize the log data stored in Elasticsearch.
- Elasticsearch:
  - Stores and indexes log data.
  - Runs on ports 9200 (HTTP) and 9300 (TCP transport).
  - Configuration ensures it runs as a single-node cluster and disables security features for simplicity.
- Logstash:
  - Processes log data from the log generator and forwards it to Elasticsearch.
  - Runs on port 9600.
  - Configuration file (`logstash.conf`) specifies the input log file and the Elasticsearch output.
- Kibana:
  - Provides a web interface to visualize and explore log data.
  - Runs on port 5601.
  - Connects to Elasticsearch to fetch and display data.
- Log Generator:
  - Generates log data and writes it to a log file.
  - Log data is read by Logstash and sent to Elasticsearch.
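The services above can be sketched in a `docker-compose.yml` along the following lines. This is an illustrative sketch, not the project's actual file: the image tags, service names, and volume paths are assumptions.

```yaml
# Hypothetical docker-compose.yml sketch; tags and paths are illustrative.
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.13.0
    environment:
      - discovery.type=single-node        # single-node cluster
      - xpack.security.enabled=false      # security disabled for simplicity
    ports:
      - "9200:9200"   # HTTP
      - "9300:9300"   # TCP transport

  logstash:
    image: docker.elastic.co/logstash/logstash:8.13.0
    ports:
      - "9600:9600"
    volumes:
      - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf
      - ./logs:/var/log/app               # shared with the log generator
    depends_on:
      - elasticsearch

  kibana:
    image: docker.elastic.co/kibana/kibana:8.13.0
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch

  log-generator:
    build: ./log-generator                # assumed build context
    volumes:
      - ./logs:/var/log/app
```

The shared `./logs` volume is what lets Logstash tail the file the log generator writes.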
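A minimal `logstash.conf` matching this pipeline might look like the following sketch; the log file path and index name are assumptions for illustration, not the project's actual values.

```conf
# Hypothetical logstash.conf sketch; path and index name are assumptions.
input {
  file {
    path => "/var/log/app/app.log"    # file written by the log generator
    start_position => "beginning"
    sincedb_path => "/dev/null"       # re-read from the start on restart
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "app-logs-%{+YYYY.MM.dd}"  # one index per day
  }
}
```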
This project leverages the following major frameworks and libraries to set up and manage the ELK stack and the custom log generator:
- Docker: Used to containerize the various services (Elasticsearch, Logstash, Kibana, and the log generator). Docker ensures that each service runs in an isolated environment, making the setup process straightforward and consistent across different environments.
- Docker Compose: Simplifies the process of managing multi-container Docker applications. With Docker Compose, we can define and run the entire stack using a single YAML file.
- Elasticsearch: A highly scalable open-source full-text search and analytics engine. It is used to store, search, and analyze the log data generated by the log generator.
- Logstash: An open-source server-side data processing pipeline that ingests data from multiple sources, transforms it, and then sends it to your preferred "stash". In this project, Logstash reads log data from the log generator and sends it to Elasticsearch.
- Kibana: An open-source data visualization and exploration tool used for reviewing the log data stored in Elasticsearch. It provides a user-friendly web interface to create visualizations, dashboards, and more.
- Python: Used to write the custom log generator script. The script simulates log data generation and writes the logs to a file, which is then read by Logstash.
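The log generator's behavior can be sketched in Python as follows. The file path, field names, and function names here are hypothetical, chosen for illustration rather than taken from the project's actual script.

```python
import json
import random
import time
from datetime import datetime, timezone

# Assumed path; must match the file Logstash is configured to read.
LOG_FILE = "/var/log/app/app.log"
LEVELS = ["INFO", "WARNING", "ERROR"]

def make_entry():
    """Build one simulated log record as a dict."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "level": random.choice(LEVELS),
        "message": f"Simulated event #{random.randint(1, 1000)}",
    }

def run(path=LOG_FILE, interval=1.0, count=None):
    """Append one JSON log line per interval; count=None runs forever."""
    written = 0
    while count is None or written < count:
        with open(path, "a") as fh:
            fh.write(json.dumps(make_entry()) + "\n")
        written += 1
        time.sleep(interval)
```

Writing one JSON object per line keeps each record easy for Logstash to parse as a single event.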
To get a local copy up and running, follow these steps.
- Docker and Docker Compose installed on your machine.
- Clone the repository and start the stack:

  ```sh
  git clone <repository_url>
  cd <repository_directory>
  docker-compose up --build
  ```