
Minimalist UI to query any Ollama model. Users can optionally enter a system prompt for better answers, and can change the LLM model, the temperature, and other settings.

KashyapTan/LLM-Query-UI


A UI to query any locally running Ollama model

Prerequisites

Download Ollama: ollama.com/download

Download a Model: ollama.com/library

After downloading, run the model locally:

  ollama run [model_name_here]

Note: This project uses llama3.1 by default. To use a different model, change the model name in server.py.
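The exact contents of server.py aren't shown here, but a minimal sketch of how a backend typically builds a request for Ollama's local HTTP API may help when swapping models. The payload fields (model, prompt, system, stream, options.temperature) are part of Ollama's documented /api/generate interface; the MODEL constant and build_payload function are illustrative names, not necessarily what server.py uses:

```python
import json

# Illustrative default; change this name to use a different downloaded model.
MODEL = "llama3.1"

def build_payload(prompt, system=None, temperature=0.7, model=MODEL):
    """Build a JSON request body for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete response instead of a stream
        "options": {"temperature": temperature},
    }
    if system:
        payload["system"] = system  # optional system prompt from the UI
    return json.dumps(payload)
```

POSTing this body to http://localhost:11434/api/generate (Ollama's default address) returns a JSON object whose response field holds the model's answer.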

Setup

Clone the project

  git clone https://github.com/KashyapTan/LLM-Query-UI.git

Install React dependencies

  npm install

Install server requirements

  pip install -r requirements.txt

Start the React dev server

  npm run dev

Start the Python server

  python server.py
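If the UI returns no answers after both servers are up, the usual culprit is that Ollama itself isn't running. A quick stdlib-only check (assuming Ollama's default port 11434; the helper name is illustrative):

```python
import urllib.request
import urllib.error

def ollama_is_running(url="http://localhost:11434", timeout=2):
    """Return True if an HTTP server answers at the given URL."""
    try:
        # Ollama's root endpoint replies with "Ollama is running" when up.
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        # Connection refused / timeout means nothing is listening there.
        return False
```

If this returns False, start Ollama (e.g. run `ollama run llama3.1` again) before retrying the UI.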
