Allow users to chat with ML models through a web interface, using the Ollama Chat API. All conversations are stored locally in the browser in an IndexedDB database.
- Create a `.env` file in the root of the project with the content of the `.env.example` file.
- Install the dependencies: `npm install`
- Build the container: `docker-compose build`
- Start the container: `docker-compose up`
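
The steps above can be collected into a single setup script. A minimal sketch, assuming the repository ships a `.env.example` and a `docker-compose.yml` as the instructions imply:

```shell
#!/usr/bin/env sh
set -e  # stop on the first failing step

# Create .env from the example file, but don't overwrite an existing one.
[ -f .env ] || cp .env.example .env

# Install the Node.js dependencies.
npm install

# Build the container image, then start it in the foreground.
docker-compose build
docker-compose up
```

Run it from the project root. Pass `-d` to `docker-compose up` instead if you prefer the container to run in the background.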
Author: Roberto Silva Z.