Error Launching App with Local LLM on WSL Using Ollama #20
Comments
I am encountering the same issue trying to run llava through ollama. I'd like to add that deleting the TEMP_MEI[...] directory in AppData (and then overwriting the executable with a new one, just in case) for some reason doesn't fix the issue. I've looked through %appdata%/Local, LocalLow and Roaming, and none of them seem to contain any trace of a persistent configuration left by the program.
After digging through the temporary file generated and following the python functions back to where the settings are read:
Thanks! This resolved the app running issue. Any success with running the app with Ollama?
No such luck - I have an exam in 2 days, so I probably won't be able to get back to this 'til after then. |
Ope, fixed it for macOS M series in 0.6.1. Will compile the Linux/Windows binaries later in the weekend; I don't have my desktop right now. Thanks for the analysis @ludvigpujsek, appreciate it!
I attempted to set up using a local LLM in WSL using Ollama. After entering the configuration details for the local model and restarting the app, I encountered the following error. Now, the app won’t open, and I’m unable to make any further changes.
Traceback (most recent call last):
  File "app\app.py", line 74, in <module>
  File "app\app.py", line 39, in __init__
  File "C:\Users\hamma\AppData\Local\Temp\_MEI1082882\core.py", line 22, in __init__
    self.llm = LLM()
               ^^^^^
  File "C:\Users\hamma\AppData\Local\Temp\_MEI1082882\llm.py", line 59, in __init__
    self.model = ModelFactory.create_model(self.model_name, base_url, api_key, context)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\hamma\AppData\Local\Temp\_MEI1082882\models\factory.py", line 13, in create_model
    raise ValueError(f'Unsupported model type {model_name}. Create entry in app/models/')
ValueError: Unsupported model type llama3.1. Create entry in app/models/
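For context, the traceback shows the factory raising because it has no entry matching the model name "llama3.1". Below is a minimal sketch of the kind of registry-based dispatch such a factory typically uses; only the `ModelFactory.create_model` signature and the error message come from the traceback, while the registry, the prefix-matching rule, and the `OllamaModel` class are illustrative assumptions, not the app's actual code:

```python
class OllamaModel:
    """Hypothetical wrapper; the real app would call the Ollama API here."""
    def __init__(self, model_name, base_url, api_key, context):
        self.model_name = model_name
        self.base_url = base_url
        self.api_key = api_key
        self.context = context

class ModelFactory:
    # Map model-name prefixes to wrapper classes. Adding an entry that
    # matches 'llama3.1' is the kind of change the error message asks for.
    _registry = {
        'llama': OllamaModel,
    }

    @classmethod
    def create_model(cls, model_name, base_url, api_key, context):
        for prefix, model_cls in cls._registry.items():
            if model_name.startswith(prefix):
                return model_cls(model_name, base_url, api_key, context)
        raise ValueError(
            f'Unsupported model type {model_name}. Create entry in app/models/'
        )

# With a matching registry entry, the same call that crashed now succeeds:
model = ModelFactory.create_model('llama3.1', 'http://localhost:11434', '', 4096)
```

If the installed version predates such an entry, the app fails at startup as shown above, and since the config is read during `__init__`, it cannot be corrected from inside the app.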