
List Ollama Library Models Command #63

Open
@kevinthedang

Description

Issue

  • A fresh Docker container started without volumes has no model inside the Ollama container.
  • Rather than depending on the MODEL environment variable, we can store which models are present in the container.
  • Note: We will need to account for the --rm case, or we could just not! Either way, defaulting to removing the models when the container dies should be fine, since Ollama always pulls the latest version of a model. So it should work as long as it can pull from the "Ollama Model Library".
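Checking what a fresh container actually holds could be sketched against Ollama's HTTP API (GET `/api/tags` is Ollama's documented endpoint for listing local models; the helper names here are hypothetical):

```javascript
// List the models currently present in the Ollama container.
// GET /api/tags returns { models: [{ name: 'llama2:latest', ... }, ...] }.
async function listLocalModels(host = 'http://localhost:11434') {
  const res = await fetch(`${host}/api/tags`);
  const { models } = await res.json();
  return models.map((m) => m.name);
}

// Pure helper: check whether a model (with or without an explicit tag)
// is already present, defaulting to the ':latest' tag.
function hasModel(localModels, name) {
  const wanted = name.includes(':') ? name : `${name}:latest`;
  return localModels.includes(wanted);
}
```

On a fresh container with no volumes, `listLocalModels` would return an empty array, which is the signal to pull a default model rather than trusting the MODEL environment variable.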

Solution

  • Investigate how we could remove the MODEL environment variable and instead store which models exist, removing them when the container shuts down.
  • Users should be able to set which model they use in their chat. For now, we can ship a default that they can then change.
  • This also means we need a way to automatically pull a model when the application spins up.
  • IMPORTANT: The NodeJS scripts likely start the discord container before the ollama container. Reverse that order so the ollama container is ready before the bot connects.
  • If the user tries to pull a nonexistent model, help them out by listing the existing models.
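The pull-on-startup and "help the user" ideas above could look roughly like this (POST `/api/pull` is Ollama's documented pull endpoint; `suggestModels` and its matching rule are hypothetical sketches):

```javascript
// Pull a model on application spin-up via Ollama's POST /api/pull.
async function pullModel(name, host = 'http://localhost:11434') {
  const res = await fetch(`${host}/api/pull`, {
    method: 'POST',
    body: JSON.stringify({ name, stream: false }),
  });
  if (!res.ok) throw new Error(`Could not pull "${name}" from the Ollama library`);
}

// Pure helper: when a pull fails, suggest existing models whose names
// contain what the user typed (ignoring any ':tag' suffix in the query).
function suggestModels(available, query) {
  const q = query.toLowerCase().split(':')[0];
  return available.filter((m) => m.toLowerCase().includes(q));
}
```

For the start-up ordering, docker-compose's `depends_on` (optionally with `condition: service_healthy` plus a healthcheck on the ollama service) is one way to make sure the Ollama container is up before the bot.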

Other Images

  • This is an image of fresh discord and ollama containers with no model set up.

Labels

  • dependency: Involves dependencies from npm or environments
  • enhancement: New feature or request
