# ollama-chat-ui

A web interface that lets users chat with ML models through the Ollama Chat API. All conversations are stored locally in the browser using IndexedDB.
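
As a rough sketch of that flow (not this project's actual code), the front end can POST the running conversation to Ollama's `/api/chat` endpoint and persist it in IndexedDB. The model name, database name, and object-store name below are placeholder assumptions:

```ts
// Sketch only: illustrates the architecture described above, assuming Ollama's default
// port (11434) and a locally available model named "llama3".

interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}

// Send the running conversation to the Ollama Chat API and return the assistant's reply.
async function chat(messages: ChatMessage[]): Promise<ChatMessage> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", messages, stream: false }),
  });
  const data = await res.json();
  return data.message as ChatMessage;
}

// Persist a conversation locally in the browser using IndexedDB
// ("ollama-chat-ui" / "conversations" are placeholder names).
function saveConversation(id: string, messages: ChatMessage[]): void {
  const open = indexedDB.open("ollama-chat-ui", 1);
  open.onupgradeneeded = () => open.result.createObjectStore("conversations");
  open.onsuccess = () => {
    const tx = open.result.transaction("conversations", "readwrite");
    tx.objectStore("conversations").put(messages, id);
  };
}
```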

## Run the project

1. Create a `.env` file in the root of the project with the content of the `.env.example` file.

2. Install the dependencies:

   ```bash
   npm install
   ```

3. Build the container:

   ```bash
   docker-compose build
   ```

4. Start the container:

   ```bash
   docker-compose up
   ```

   Once the stack is running, you can check that the Ollama API responds, as sketched below.
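
A minimal sanity check (a sketch, not part of the project): Ollama's default port is 11434; adjust the URL if your `docker-compose.yml` maps a different one.

```ts
// Run with e.g. `npx tsx check.ts` (Node 18+ provides global fetch).
// Queries Ollama's /api/tags endpoint, which lists the locally available models.
async function checkOllama(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) throw new Error(`Ollama not reachable: HTTP ${res.status}`);
  const { models } = await res.json();
  console.log("Ollama is up; models:", models.map((m: { name: string }) => m.name));
}

checkOllama().catch(console.error);
```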

## Credits

Author: Roberto Silva Z.