Labels
enhancement (New feature or request), good first issue (Good for newcomers)
Description
💡 Is your feature request related to a problem?
Currently, the app only has code for Llama 3.2 3B, and the production build ships with the same model. This is good enough for most users, but there are several power users who can run larger models, and they should be able to swap and manage models easily within the app.
✨ Describe the Solution
- Backend changes: the Python backend will need to dynamically pull the model name from the settings the user has set. User settings will also have to be persisted somewhere.
- Frontend changes: users will need an option on the Settings page where they can specify the model name. Once they specify it, Ollama should be called to download the model, and the updated settings should be saved somewhere so the Python backend can read the name and use the selected model.
- Ideally, we would also show the models the user has already downloaded, plus download progress for any models currently being downloaded.
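The settings round-trip described above could look something like this minimal sketch. The file path, key names, and default model are assumptions for illustration; the issue does not specify where settings should live:

```python
import json
from pathlib import Path

# Hypothetical settings location and default -- not specified in the issue.
SETTINGS_PATH = Path("settings.json")
DEFAULT_MODEL = "llama3.2:3b"


def load_model_name(path: Path = SETTINGS_PATH) -> str:
    """Return the user-selected model name, falling back to the default."""
    try:
        settings = json.loads(path.read_text())
    except FileNotFoundError:
        return DEFAULT_MODEL
    return settings.get("model", DEFAULT_MODEL)


def save_model_name(name: str, path: Path = SETTINGS_PATH) -> None:
    """Persist the model choice so the Python backend can pick it up."""
    settings = {}
    if path.exists():
        settings = json.loads(path.read_text())
    settings["model"] = name
    path.write_text(json.dumps(settings))
```

The frontend would call something like `save_model_name()` after a successful download, and the backend would call `load_model_name()` when constructing its Ollama request.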
🔄 Alternatives Considered
A quick fix could be to add a simple .conf file somewhere that lets the user set the model name, and then retrieve the name from there at run-time. If the model has not been downloaded yet, the Python backend can first trigger Ollama to pull the model and then use it.
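The quick fix might be sketched as follows. The .conf path is a placeholder, and the pull goes through Ollama's REST API (`POST /api/pull` on the default `localhost:11434`), which streams JSON progress events and is effectively a no-op for models that are already present:

```python
import json
import urllib.request
from pathlib import Path

# Hypothetical .conf location -- the issue leaves the path open.
CONF_PATH = Path("model.conf")
DEFAULT_MODEL = "llama3.2:3b"
OLLAMA_URL = "http://localhost:11434"  # Ollama's default address


def read_model_name(path: Path = CONF_PATH) -> str:
    """Read the model name from a one-line .conf file at run-time."""
    if path.exists():
        name = path.read_text().strip()
        if name:
            return name
    return DEFAULT_MODEL


def ensure_model(name: str) -> None:
    """Ask Ollama to pull the model before first use."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/pull",
        data=json.dumps({"model": name}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        for line in resp:  # streamed JSON progress events
            print(json.loads(line).get("status", ""))
```

The streamed progress events from `/api/pull` are also what the frontend could surface for the download-progress display mentioned above.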