A simple local chat app using Flask and Ollama to run LLMs like llama3, command-r7b, and deepseek-r1. Switch models, chat in a web UI, and export history — all offline, no API keys needed.
python open-source flask ai webapp chat-application offline-chat llm local-llm ollama deepseek llama3 command-r7b
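As a rough illustration of the architecture described above, here is a minimal sketch of a Flask endpoint that forwards a chat request to a locally running Ollama server. It assumes the `ollama` Python package is installed, an Ollama server is running on its default local port, and the requested model (e.g. `llama3`) has been pulled; route name and request shape are illustrative, not the project's actual API.

```python
# Minimal sketch: a Flask route that proxies chat messages to a local Ollama
# instance. Assumes the `ollama` Python package and a running Ollama server
# with the requested model already pulled (e.g. `ollama pull llama3`).
from flask import Flask, request, jsonify
import ollama

app = Flask(__name__)

@app.route("/chat", methods=["POST"])
def chat():
    data = request.get_json(force=True)
    model = data.get("model", "llama3")    # model can be switched per request
    messages = data.get("messages", [])    # chat history from the web UI
    response = ollama.chat(model=model, messages=messages)
    return jsonify({"reply": response["message"]["content"]})

if __name__ == "__main__":
    app.run(debug=True)
```

Because everything runs against the local Ollama server, no API keys or network access are needed; switching models is just a matter of passing a different `model` name that has been pulled locally.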