
# ollama-chat-ui

A web interface that lets users chat with ML models through the Ollama Chat API. All conversations are stored locally in the browser using IndexedDB.
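For context, Ollama's chat endpoint (`POST /api/chat`) accepts a JSON body with a model name and a message history. A minimal sketch of building such a body follows; the model name, helper function, and messages are illustrative placeholders, not taken from this project's code:

```javascript
// Build a request body for Ollama's /api/chat endpoint.
// buildChatRequest is a hypothetical helper; the model name is a placeholder.
function buildChatRequest(model, messages) {
  return {
    model,        // e.g. "llama3" — whichever model is pulled locally
    messages,     // [{ role: "system" | "user" | "assistant", content: string }]
    stream: true, // Ollama streams the response as newline-delimited JSON by default
  };
}

// Example: the body you would POST to http://localhost:11434/api/chat
const body = buildChatRequest("llama3", [
  { role: "user", content: "Hello!" },
]);
console.log(JSON.stringify(body));
```

The UI can append each assistant reply to `messages` before the next request, so the model sees the full conversation history.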

## Run the project

1. Create a `.env` file in the root of the project with the content of the `.env.example` file.

2. Install the dependencies:

   ```sh
   npm install
   ```

3. Build the container:

   ```sh
   docker-compose build
   ```

4. Start the container:

   ```sh
   docker-compose up
   ```

## Credits

Author: Roberto Silva Z.