LLMChat

A local LLM chat application that can also use your journal as a knowledge base for context-aware conversations.

(Demo video: LLMChat.mp4)

Setup

  1. Install Ollama:

```shell
# macOS/Linux
brew install ollama
```

  2. Install a model:

```shell
ollama pull deepseek-r1:1.5b
```

  3. Install application dependencies:

```shell
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```

  4. Run the application:

```shell
python app.py
```
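Before launching the app, it can help to confirm that the local Ollama server is actually running. A minimal check, assuming Ollama's default address `http://localhost:11434` (its `/api/tags` endpoint lists installed models):

```python
import urllib.request
import urllib.error

def ollama_running(url="http://localhost:11434/api/tags", timeout=2):
    """Return True if a local Ollama server answers its model-list endpoint."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

print("Ollama reachable:", ollama_running())
```

If this prints `False`, start the server with `ollama serve` and try again.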

Usage

  1. Enter your model name in the application (e.g., `deepseek-r1:1.5b`)
  2. Add entries to your diary through the memory interface
  3. Chat with the AI using your diary as context
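The flow above amounts to retrieval-augmented prompting: relevant diary entries are pulled out and prepended to the question before it reaches the model. A toy sketch of the idea, using crude word overlap in place of the real embedding-based retrieval (illustrative only; the app itself uses LangChain and ChromaDB):

```python
import re

def tokens(text):
    """Lowercased word set, stripped of punctuation."""
    return set(re.findall(r"\w+", text.lower()))

def relevant_entries(entries, question, top_k=2):
    """Rank diary entries by word overlap with the question."""
    q = tokens(question)
    ranked = sorted(entries, key=lambda e: len(q & tokens(e)), reverse=True)
    return ranked[:top_k]

def build_prompt(entries, question, top_k=2):
    """Prepend the most relevant entries as context for the LLM."""
    context = "\n".join(relevant_entries(entries, question, top_k))
    return f"Use this diary context:\n{context}\n\nQuestion: {question}"

diary = ["Monday: started learning Flask.", "Tuesday: went hiking."]
print(build_prompt(diary, "What did I learn about Flask?"))
```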

Technical Stack

  • Backend: Flask, LangChain, Ollama
  • Frontend: HTML, CSS, JavaScript, Tailwind CSS
  • Database: ChromaDB (Vector Store)
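ChromaDB's role in this stack is nearest-neighbour search over embedded diary text. A toy illustration of the underlying mechanism, with hand-rolled bag-of-words vectors and cosine similarity standing in for a real embedding model (the class name and add/query shape here are a sketch, not ChromaDB's actual API):

```python
import math
import re
from collections import Counter

def embed(text):
    """Toy stand-in for an embedding model: bag-of-words counts."""
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class ToyVectorStore:
    """Mimics the add/query pattern of a vector store like ChromaDB."""
    def __init__(self):
        self.docs = []

    def add(self, text):
        self.docs.append((text, embed(text)))

    def query(self, text, top_k=1):
        q = embed(text)
        ranked = sorted(self.docs, key=lambda d: cosine(q, d[1]), reverse=True)
        return [doc for doc, _ in ranked[:top_k]]

store = ToyVectorStore()
store.add("Monday: fixed a bug in the Flask backend.")
store.add("Tuesday: long walk by the river.")
print(store.query("What happened with Flask?"))
```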
