
Omlols (OhMyLlamaOnLifeSupport)

An Ollama custom client for prompt-engineering and modularity

Built and used personally by me (momo)!


🚀 What is Omlols?

Omlols — short for OhMyLlamaOnLifeSupport — is a lightweight, customizable Python client designed to interact with Ollama in a clean, modular, and prompt-engineer-friendly way.

This project is built for personal use, but it's structured well enough for others to extend, study, or borrow ideas from. The main goals are to:

  • Create a clean interactive interface for Ollama
  • Allow deep prompt control and prompt-engineering experiments
  • Provide modularity for plugins, memory, and external text sources
  • Keep everything simple, readable, and easy to modify

If you're into tinkering with LLM clients, RAG experiments, or custom workflows, Omlols is for you.


📦 Features

  • 🧩 Modular plugin system
  • 🧠 Optional chat memory integration
  • 📚 External text source loading
  • ✨ Clean prompt construction
  • ⚙️ Built for experimenters and prompt engineers
  • 💻 Fully Python-based

📋 Prerequisites

Make sure you have the following installed:

  1. Python 3.9+
  2. Ollama (installed and running)
  3. tinyllama model downloaded in Ollama (e.g. via `ollama pull tinyllama`)
  4. Install dependencies:
pip install -r requirements.txt

🔧 Installation & Usage

Clone the repo:

git clone https://github.com/MOHAPY24/Omlols
cd Omlols

Run Omlols:

chmod +x omlols
./omlols

Customize your plugins, memory behavior, and sources as you like — the code is intentionally easy to modify.
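If you'd rather script against Ollama directly than use the interactive client, Ollama serves a REST endpoint at `/api/generate` on its default port 11434. A minimal sketch (not the actual Omlols code; `build_payload` and `generate` are names invented here):

```python
# Minimal non-streaming call to Ollama's /api/generate endpoint,
# using only the standard library. Assumes Ollama is running locally
# on its default port.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for a non-streaming generate request."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send the prompt to Ollama and return the model's reply text."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama with tinyllama pulled):
#   print(generate("tinyllama", "Say hello in one sentence."))
```

Setting `"stream": False` makes Ollama return a single JSON object instead of a stream of chunks, which keeps quick scripts simple.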


📄 License

Omlols (OhMyLlamaOnLifeSupport) is licensed under the Apache 2.0 License, with attribution provided in the NOTICE file. See the full license in LICENSE.
