OllaPy is a sleek, self-contained web interface for interacting with local language models through Ollama. It provides a private and secure environment for your conversations, running entirely on your local machine. With a lightweight Python/Flask backend and a vanilla JavaScript frontend, OllaPy is designed for simplicity, privacy, and hackability.
- 💻 Desktop Application: OllaPy can now be run as a standalone desktop application using Electron, providing a native-like experience.
- 🔒 Absolute Privacy: All interactions happen on your local machine. No data is ever sent to third-party servers.
- 💾 Chat History: Conversations are automatically saved as JSON files in a `logs` directory, allowing for easy access, backup, and management.
- 🤖 Model Selection: Seamlessly switch between different Ollama models using a dropdown menu in the user interface.
- ✍️ Markdown Rendering: Enjoy beautifully formatted AI responses, including lists, tables, and code blocks.
- 💨 Real-Time Streaming: Experience interactive conversations with the AI's responses streamed in real-time.
- 📊 Token Counter: Monitor the context size of your conversations with a progress bar and token counter.
- ⏱️ Performance Metrics: Track the generation time for each AI response.
- 📈 System Monitoring: Real-time display of CPU and RAM usage within the Electron application.
- 🚀 Integrated Backend Startup: The Electron application now automatically manages the startup and shutdown of the Flask backend server.
- 🛑 Cancel Responses: Interrupt the AI's response generation at any time.
- 🛡️ Built-in Security: Client-side HTML sanitization using `DOMPurify` to prevent XSS attacks.
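The real-time streaming feature works over Ollama's newline-delimited JSON responses. As a minimal sketch (the helper name is illustrative, not part of OllaPy; chunk fields follow Ollama's `/api/generate` format), accumulating a streamed response looks like this:

```python
import json

def accumulate_stream(lines):
    """Accumulate the text of an Ollama-style NDJSON stream.

    Each line is a JSON object; for /api/generate the partial text is in
    the "response" field and the final chunk sets "done": true.
    """
    text = []
    for line in lines:
        if not line.strip():
            continue
        chunk = json.loads(line)
        text.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(text)

# Canned chunks for illustration; a live stream would come line by line
# from http://localhost:11434/api/generate with "stream": true.
sample = [
    '{"response": "Hel", "done": false}',
    '{"response": "lo!", "done": true}',
]
print(accumulate_stream(sample))  # Hello!
```

In the browser, OllaPy's frontend does the equivalent with `fetch` and a streamed reader, rendering each chunk as it arrives.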
OllaPy's architecture is composed of two main components:
- `server.py` (Backend): A lightweight Flask server that serves the main `index.html` file and provides a REST API for managing chat logs (CRUD operations).
- `index.html` & JavaScript modules (Frontend): A vanilla JavaScript application that communicates directly with the Ollama server for AI interactions and with the Flask server for chat history management.
```mermaid
graph TD
    A[User 👨💻] -- Interacts with --> B[Browser: index.html]
    B -- "API Requests (Save/Load)" --> C[Backend Flask: server.py]
    C -- "Reads/Writes" --> D["Log Files<br/>(logs/*.json) 📝"]
    B -- "LLM Requests (Prompt)" --> E[Ollama Server 🧠]
    E -- "Streaming Response" --> B
```
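To make the backend's role concrete, here is a hypothetical sketch of the JSON-file CRUD that `server.py` performs on the `logs` directory. The function names are illustrative only; the actual implementation lives in `server.py` behind the REST API:

```python
import json
from pathlib import Path

LOGS_DIR = Path("logs")  # same directory the Flask backend writes to

def save_chat(chat_id, messages, logs_dir=LOGS_DIR):
    """Create or update a chat log as logs/<chat_id>.json."""
    logs_dir.mkdir(exist_ok=True)
    path = logs_dir / f"{chat_id}.json"
    path.write_text(json.dumps(messages, indent=2), encoding="utf-8")

def load_chat(chat_id, logs_dir=LOGS_DIR):
    """Read one chat log back as Python objects."""
    return json.loads((logs_dir / f"{chat_id}.json").read_text(encoding="utf-8"))

def list_chats(logs_dir=LOGS_DIR):
    """List saved chat ids (the filenames without .json)."""
    return sorted(p.stem for p in logs_dir.glob("*.json"))

def delete_chat(chat_id, logs_dir=LOGS_DIR):
    """Remove one chat log file."""
    (logs_dir / f"{chat_id}.json").unlink()
```

Because each conversation is just a JSON file, backups and migrations are a matter of copying the `logs` directory.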
- Python 3 and `pip`.
- Ollama installed and running.
- At least one Ollama model downloaded (e.g., `ollama pull gemma3`).
- Clone the repository:

  ```bash
  git clone https://github.com/your-username/ollapy.git
  cd ollapy
  ```
- Run the start script:

  The `start.sh` script automates the setup process, including installing dependencies and launching the necessary servers.

  ```bash
  ./start.sh
  ```

  This will:
  - Install the required Python packages from `requirements.txt`.
  - Start the Ollama server in the background.
  - Start the Flask backend server.
  - Open the OllaPy web interface in your default browser.
- Start chatting!

  You can now interact with your local language models through the OllaPy interface.
OllaPy can also be run as a desktop application using Electron, providing a native-like experience.
- Install Node.js dependencies:

  ```bash
  npm install
  ```
- Start the Electron application:

  ```bash
  npm start
  ```
This will launch the OllaPy desktop application. Ensure your Ollama server is running in the background.
Note: The Electron application currently expects the Flask backend to be running separately. You can start it using `./start.sh` (which also starts Ollama) or by manually running `python server.py` in a separate terminal.
To customize the default settings, such as the default Ollama model or the Ollama server URL, modify the `js/config.js` file:
```javascript
// js/config.js
export const DEFAULT_OLLAMA_MODEL = "gemma3"; // Set your preferred default model
export const OLLAMA_BASE_URL = "http://localhost:11434"; // Modify if your Ollama server runs on a different URL
```

The Flask backend exposes a simple REST API for managing chat logs:
- `GET /api/chats`: Retrieves a list of all saved chats.
- `GET /api/chats/<chat_id>`: Retrieves the content of a specific chat.
- `POST /api/chats`: Saves or updates a chat.
- `DELETE /api/chats/<chat_id>`: Deletes a specific chat.
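As a hedged example of driving these endpoints from a script, here is a small Python client sketch. The base URL is an assumption (Flask's default port is 5000; adjust to match how you run `server.py`), and the helper names are illustrative:

```python
import json
import urllib.request

# Assumed base URL; Flask defaults to port 5000 unless configured otherwise.
BASE_URL = "http://localhost:5000"

def chat_path(chat_id=None):
    """Build the API path for the chat collection or one chat."""
    return "/api/chats" if chat_id is None else f"/api/chats/{chat_id}"

def api_request(method, path, payload=None, base_url=BASE_URL):
    """Send one request to the chat-log API and return the parsed JSON body."""
    data = json.dumps(payload).encode("utf-8") if payload is not None else None
    req = urllib.request.Request(
        base_url + path,
        data=data,
        method=method,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Usage (requires the Flask backend to be running):
# chats = api_request("GET", chat_path())
# one   = api_request("GET", chat_path("my-chat"))
# api_request("DELETE", chat_path("old-chat"))
```

The exact shape of the JSON bodies depends on `server.py`; inspect a file in `logs/` to see the stored format.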
Contributions are welcome! If you have any ideas for improvements or new features, feel free to fork the repository, make your changes, and submit a pull request.
Some ideas for contributions include:
- Implementing a more advanced chat history with folders and search functionality.
- Adding support for different themes (e.g., light, dark, cyberpunk).
- Adding the ability to export chats as Markdown or PDF files.
