Charla is a terminal-based chat application that integrates with Ollama, a backend designed to serve language models. To use Charla, make sure the Ollama server is running and at least one language model is installed.
Install Charla using pipx:

```shell
pipx install charla
```
Launch the chat console by typing `charla` in your terminal, or view all available command line options with `charla -h`.
- Terminal-based chat system that supports context-aware conversations using local language models.
- Chat sessions are saved as markdown files in the user's documents directory when ending a chat.
- Prompt history is saved and previously entered prompts are auto-suggested.
- Switch between single-line and multi-line input modes without interrupting the chat session.
- Store default user preferences in a settings file.
- Provide a system prompt for a chat session.
- Load content from local files and web pages to append to prompts.
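The last feature above — appending local file content to a prompt — can be pictured with a minimal sketch. This is illustrative only, not Charla's actual implementation; the helper name `append_file_to_prompt` is hypothetical:

```python
from pathlib import Path


def append_file_to_prompt(prompt: str, file_path: str) -> str:
    """Append the contents of a local file to a prompt.

    Hypothetical sketch -- not Charla's actual code. The file content is
    separated from the prompt by a blank line so the model sees it as
    distinct context.
    """
    text = Path(file_path).read_text(encoding="utf-8")
    return f"{prompt}\n\n{text}"
```

A web-page variant would work the same way, with the fetched page body taking the place of the file content.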
Example settings file:

```json
{
  "model": "llama3:latest",
  "chats_path": "./chats",
  "prompt_history": "./prompt-history.txt"
}
```
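The source doesn't spell out how the settings file is applied, but a file like the one above is typically merged over built-in defaults, with user values winning. A minimal sketch under that assumption (the function name and `DEFAULTS` mapping are hypothetical, not Charla's actual code):

```python
import json
from pathlib import Path

# Hypothetical defaults mirroring the example settings file above.
DEFAULTS = {
    "model": "llama3:latest",
    "chats_path": "./chats",
    "prompt_history": "./prompt-history.txt",
}


def load_settings(path: str) -> dict:
    """Merge user settings over defaults; keys missing from the user's
    file fall back to the defaults. Illustrative sketch only."""
    settings = dict(DEFAULTS)
    p = Path(path)
    if p.exists():
        settings.update(json.loads(p.read_text(encoding="utf-8")))
    return settings
```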
Run the command-line interface directly from the project source without installing the package:
```shell
python -m charla.cli
```
List installed models:

```shell
curl http://localhost:11434/api/tags
```

Show model info:

```shell
curl http://localhost:11434/api/show -d '{"name": "phi3"}'
```
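The `/api/tags` response is a JSON object with a `models` array whose entries carry a `name` field. A small helper can pull out the installed model names; the live HTTP call is commented out so the sketch stays self-contained (it would need a running Ollama server):

```python
import json
from urllib.request import urlopen


def model_names(payload: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response,
    whose shape is {"models": [{"name": ...}, ...]}."""
    return [m["name"] for m in payload.get("models", [])]


# Live call against a running ollama server:
# with urlopen("http://localhost:11434/api/tags") as resp:
#     print(model_names(json.load(resp)))
```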
Charla is distributed under the terms of the MIT license.