Description
Issue
- When dealing with fresh Docker containers, assuming no volumes are mounted, there is no model inside the Ollama container.
- Rather than depending on the `MODEL` environment variable, we can store a record of which models are present in the container.
- Note: We will need to account for the `--rm` case, or we could just not! Either way, defaulting to removing the models when the container dies should be fine, since Ollama always pulls the latest version of a model. So nothing is lost as long as it can pull from the Ollama Model Library.
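A minimal sketch of how the "no model in a fresh container" check could work, using Ollama's HTTP API (`GET /api/tags` to list installed models, `POST /api/pull` to fetch one). The `OLLAMA_URL` value and the `llama3` default are illustrative assumptions, not project settings:

```javascript
// Assumed defaults for illustration only; the real service name and
// default model would come from the project's configuration.
const OLLAMA_URL = process.env.OLLAMA_URL ?? "http://ollama:11434";
const DEFAULT_MODEL = "llama3";

// Pure helper: extract model names from Ollama's GET /api/tags response,
// which has the shape { models: [{ name: "llama3:latest", ... }, ...] }.
function modelNames(tags) {
  return (tags.models ?? []).map((m) => m.name);
}

// Ensure the given model exists in the container, pulling it if missing.
async function ensureModel(model = DEFAULT_MODEL) {
  const res = await fetch(`${OLLAMA_URL}/api/tags`);
  const installed = modelNames(await res.json());
  if (installed.includes(model)) return installed;
  // Fresh container with no volume: pull from the Ollama Model Library.
  await fetch(`${OLLAMA_URL}/api/pull`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, stream: false }),
  });
  return [...installed, model];
}
```

Because the check goes through `/api/tags` rather than an env var, the same call works whether the container kept its models or started empty after `--rm`.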
Solution
- Investigate how we could remove the `MODEL` environment variable and instead store which models exist, removing them when the container shuts down.
- Users should be able to set which model they use in their chat. For now, we can ship a default that they then have to change.
- This also means we need a way to automatically pull a model when the application spins up.
- IMPORTANT: The NodeJS scripts likely start the `discord` container and then the `ollama` container. Reverse that order so the `ollama` container is ready before the bot starts.
- If the user tries to pull a nonexistent model, help them out by listing the models that do exist.