A lightweight open-source LLM agent with RAG (Retrieval-Augmented Generation) capabilities, able to query both its internal model knowledge and a connected MySQL database. It allows users to ask natural-language questions that the agent translates into SQL queries and executes automatically.
This project is intended for learning purposes only. It is not designed or guaranteed for production use. Any deployment or usage is done at your own risk. The author assumes no responsibility for data loss, security issues, or system failures resulting from the use of this project.
- 💡 RAG-powered reasoning — Combines generative LLM reasoning with real database retrieval.
- 🗄️ MySQL integration — Directly connects to your MySQL data.
- 🐳 Dockerized setup — One-command deployment with Docker Compose.
- 🌐 HTTP API — Simple API endpoint for natural-language queries.
Prerequisites:

- 🐳 Docker
- 💻 Basic terminal usage
Place your SQL dump file at `./db/db.sql`. This file will be imported into the MySQL container during startup.
Store your MySQL root password in `./db/passwd.txt`; the container reads it during startup.

Make sure that `./db/passwd.txt` contains only the password, with no trailing newline. Many editors automatically append a newline at the end of a file, which will cause MySQL authentication to fail.
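To check whether an existing password file already ends in a newline, a small sketch like the following can help (it builds a throwaway example file so it is self-contained; `db/passwd.txt` and the "no trailing newline" requirement come from this README, the password value is a placeholder):

```python
# Sketch: detect a trailing newline in a password file.
from pathlib import Path

passwd_file = Path("/tmp/passwd_demo.txt")  # stand-in for db/passwd.txt
passwd_file.write_bytes(b"your_password_here\n")  # simulate an editor-added newline

data = passwd_file.read_bytes()
if data.endswith(b"\n"):
    print("trailing newline found: MySQL authentication will fail")
else:
    print("file is clean")
```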
To safely create the file without adding a newline, run:
```
echo -n "your_password_here" > db/passwd.txt
```

If you wish to change the default Ollama model (`gpt-oss:20b`), set the `OLLAMA_MODEL` environment variable:
```
export OLLAMA_MODEL=model_identifier
```

Run the following command from the project root:
```
docker compose up
```

This will start both the LLM agent and the MySQL database, automatically initializing the schema and data from `./db/db.sql` and reading the MySQL password from `./db/passwd.txt`.
⚠️ Note: If this is your first run or the required Ollama models are not yet downloaded, please wait until the model download completes before sending any queries to the agent.
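One way to wait programmatically is to poll Ollama's `/api/tags` endpoint, which lists the models already downloaded. This is an illustrative sketch, not part of this project: port 11434 is Ollama's default and may be mapped differently in `compose.yaml`.

```python
# Sketch: block until the required Ollama model has finished downloading.
import json
import time
import urllib.request

def model_ready(tags_json: str, model: str) -> bool:
    """Check an /api/tags response body for the model name."""
    models = json.loads(tags_json).get("models", [])
    return any(m.get("name", "").startswith(model) for m in models)

def wait_for_model(model: str = "gpt-oss:20b",
                   url: str = "http://localhost:11434/api/tags") -> None:
    """Poll Ollama until `model` appears in its local model list."""
    while True:
        try:
            with urllib.request.urlopen(url) as resp:
                if model_ready(resp.read().decode(), model):
                    return
        except OSError:
            pass  # Ollama is not reachable yet
        time.sleep(5)
```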
After startup, query the agent via HTTP:
```
curl -G --data-urlencode "q=Your query" http://localhost:8000
```

The agent will process your question, generate the necessary SQL queries, run them on the MySQL database, and return the result in natural language.
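The same call can be made from Python. This small client assumes only what the curl example shows: a GET endpoint on port 8000 taking a single `q` parameter.

```python
# Sketch of a client for the agent's HTTP endpoint described above.
import urllib.parse
import urllib.request

def build_query_url(question: str, base_url: str = "http://localhost:8000") -> str:
    """URL-encode the question into the agent's `q` query parameter."""
    return f"{base_url}/?{urllib.parse.urlencode({'q': question})}"

def ask_agent(question: str) -> str:
    """Send the question and return the agent's natural-language answer."""
    with urllib.request.urlopen(build_query_url(question)) as resp:
        return resp.read().decode()

print(build_query_url("How many customers are there?"))
# The network call requires the stack from `docker compose up` to be running:
# print(ask_agent("How many customers are there?"))
```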
```
.
├── db/
│   ├── db.sql          # Your MySQL dump file
│   └── passwd.txt      # Your MySQL password
├── ollama/
│   ├── entrypoint.sh   # Pulls the Ollama model before launching the agent
│   └── ollama/         # Ollama model assets
│       └── ...
├── src/
│   └── sqlagent/       # Source code of the agent
│       └── ...
├── tests/              # Unit tests
│   └── ...
├── compose.yaml        # Docker Compose configuration
├── Dockerfile          # Docker build instructions for the LLM agent
├── LICENSE             # License file
├── README.md
├── pyproject.toml
└── uv.lock             # Lockfile generated by uv to ensure deterministic dependency versions
```
This project is distributed under the MIT License.