A local LLM chat application that can also use your journal as a knowledge base for context-aware conversations.
LLMChat.mp4
- Install Ollama:

  ```bash
  # macOS/Linux
  brew install ollama
  ```
- Pull a model:

  ```bash
  ollama pull deepseek-r1:1.5b
  ```
- Install application dependencies:

  ```bash
  python -m venv venv
  source venv/bin/activate
  pip install -r requirements.txt
  ```
- Run the application:

  ```bash
  python app.py
  ```
- Enter your model name in the application (e.g., `deepseek-r1:1.5b`)
- Add entries to your diary through the memory interface
- Chat with the AI using your diary as context
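To illustrate how diary entries become chat context, here is a minimal, runnable sketch of the retrieval step. The real app uses ChromaDB embeddings for similarity search; this stand-in uses a simple word-overlap score (a hypothetical simplification, not the app's actual code) so the idea runs without any services:

```python
# Hypothetical sketch: rank diary entries by relevance to a question.
# In the real app, ChromaDB computes vector similarity over embeddings;
# here a plain word-overlap count stands in for that similarity score.

def score(entry: str, question: str) -> int:
    """Count words the diary entry shares with the question."""
    return len(set(entry.lower().split()) & set(question.lower().split()))

def retrieve(entries: list[str], question: str, k: int = 2) -> list[str]:
    """Return the k diary entries most relevant to the question."""
    return sorted(entries, key=lambda e: score(e, question), reverse=True)[:k]

diary = [
    "Went hiking in the mountains and saw a deer.",
    "Tried a new pasta recipe for dinner.",
    "Finished reading a book about astronomy.",
]

# The top-ranked entries are prepended to the prompt as context.
context = retrieve(diary, "what did I see while hiking", k=1)
```

The retrieved entries are then injected into the prompt sent to the model, which is what makes the conversation "context-aware".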
- Backend: Flask, LangChain, Ollama
- Frontend: HTML, CSS, JavaScript, Tailwind CSS
- Database: ChromaDB (Vector Store)
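Under the hood, the backend talks to the local Ollama server. The app wires this through LangChain, but the equivalent raw HTTP call can be sketched with only the standard library, using Ollama's documented `/api/generate` endpoint (the prompt template below is illustrative, not the app's exact wording):

```python
import json
import urllib.request

# Ollama's default local endpoint for non-streaming generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_prompt(context: str, question: str) -> str:
    """Combine retrieved diary context with the user's question."""
    return (
        "Use the diary context below to answer.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

def ask_ollama(model: str, prompt: str) -> str:
    """Send a generate request to a locally running Ollama server."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires `ollama serve` running and the model pulled):
# ask_ollama("deepseek-r1:1.5b",
#            build_prompt("Tried a new pasta recipe.", "What did I cook?"))
```

In the application itself, LangChain handles this request/response cycle along with the ChromaDB retrieval.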