Omlols — short for OhMyLlamaOnLifeSupport — is a lightweight, customizable Python client designed to interact with Ollama in a clean, modular, and prompt-engineer-friendly way.
This project is built for personal use, but it's structured well enough for others to extend, study, or borrow ideas from. The main goals are to:
- Create a clean interactive interface for Ollama
- Allow deep prompt control and prompt-engineering experiments
- Provide modularity for plugins, memory, and external text sources
- Keep everything simple, readable, and easy to modify
If you're into tinkering with LLM clients, RAG experiments, or custom workflows, Omlols is for you.
- 🧩 Modular plugin system
- 🧠 Optional chat memory integration
- 📚 External text source loading
- ✨ Clean prompt construction
- ⚙️ Built for experimenters and prompt engineers
- 💻 Fully Python-based
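
To give a feel for the plugin system, here is a minimal sketch of what a prompt-transforming plugin could look like. The `Plugin` base class, `transform_prompt` hook, and `build_prompt` helper below are illustrative assumptions, not Omlols' actual API; check the source for the real extension points.

```python
# Illustrative sketch only: these names are assumptions, not Omlols' real API.


class Plugin:
    """Hypothetical base class: a plugin gets a chance to rewrite the prompt."""

    def transform_prompt(self, prompt: str) -> str:
        return prompt


class TimestampPlugin(Plugin):
    """Example plugin that prepends the current time to every prompt."""

    def transform_prompt(self, prompt: str) -> str:
        from datetime import datetime
        return f"[{datetime.now():%Y-%m-%d %H:%M}] {prompt}"


def build_prompt(prompt: str, plugins: list[Plugin]) -> str:
    """Run the prompt through each plugin in order."""
    for plugin in plugins:
        prompt = plugin.transform_prompt(prompt)
    return prompt


print(build_prompt("Summarize this file.", [TimestampPlugin()]))
```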
Make sure you have the following installed:
- Python 3.9+
- Ollama (installed and running)
- The `tinyllama` model pulled in Ollama (run `ollama pull tinyllama`)
Clone the repo:

```bash
git clone https://github.com/MOHAPY24/Omlols
cd Omlols
```

Install dependencies:

```bash
pip install -r requirements.txt
```

Run Omlols:

```bash
chmod +x omlols
./omlols
```

Customize your plugins, memory behavior, and sources as you like; the code is intentionally easy to modify.
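
For orientation, a client like Omlols ultimately talks to a locally running Ollama over its HTTP API. The snippet below is a standalone sketch that hits Ollama's standard `/api/generate` endpoint with the `requests` library; it illustrates the kind of request involved and does not use Omlols' own code.

```python
import requests

# Talk to a locally running Ollama directly via its HTTP API.
# This is independent of Omlols; it only assumes Ollama is serving
# on its default port and that tinyllama has been pulled.
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "tinyllama",       # the model pulled during setup
        "prompt": "Say hello in one sentence.",
        "stream": False,            # ask for a single JSON reply
    },
    timeout=60,
)
print(response.json()["response"])
```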
Omlols (OhMyLlamaOnLifeSupport) is licensed under the Apache License 2.0, with attribution provided in the NOTICE file. See the full license text in LICENSE.