Error Launching App with Local LLM on WSL Using Ollama #20

Open
Hammad-Mir opened this issue Nov 2, 2024 · 5 comments

@Hammad-Mir

I attempted to set up a local LLM on WSL using Ollama. After entering the configuration details for the local model and restarting the app, I encountered the error below. Now the app won't open, and I'm unable to make any further changes.

```
Traceback (most recent call last):
  File "app\app.py", line 74, in <module>
  File "app\app.py", line 39, in __init__
  File "C:\Users\hamma\AppData\Local\Temp\_MEI1082882\core.py", line 22, in __init__
    self.llm = LLM()
               ^^^^^
  File "C:\Users\hamma\AppData\Local\Temp\_MEI1082882\llm.py", line 59, in __init__
    self.model = ModelFactory.create_model(self.model_name, base_url, api_key, context)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\hamma\AppData\Local\Temp\_MEI1082882\models\factory.py", line 13, in create_model
    raise ValueError(f'Unsupported model type {model_name}. Create entry in app/models/')
ValueError: Unsupported model type llama3.1. Create entry in app/models/
```
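
For context, the traceback suggests a registry-style factory: create_model looks the model name up and raises for anything unregistered. A minimal sketch of that pattern follows; only the create_model signature and the error message come from the traceback, while the registry contents and the stand-in model class are illustrative:

```python
class OpenAIModel:
    """Hypothetical stand-in for a model class registered in app/models/."""
    def __init__(self, base_url, api_key, context):
        self.base_url, self.api_key, self.context = base_url, api_key, context

# Illustrative registry; "llama3.1" has no entry, hence the ValueError above.
_MODEL_REGISTRY = {"gpt-4o": OpenAIModel}

class ModelFactory:
    @staticmethod
    def create_model(model_name, base_url, api_key, context):
        cls = _MODEL_REGISTRY.get(model_name)
        if cls is None:
            raise ValueError(f'Unsupported model type {model_name}. Create entry in app/models/')
        return cls(base_url, api_key, context)
```

If that reading is right, the app persists whatever model name was entered and re-runs this lookup on every launch, which would explain why it fails at startup rather than only when the model is used.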

@ludvigpujsek

I am encountering the same issue trying to run llava through Ollama. I'd like to add that deleting the Temp\_MEI[...] directory in AppData (and then overwriting the executable with a new one, just in case) for some reason doesn't fix the issue. I've looked through AppData's Local, LocalLow, and Roaming folders, and none of them seem to contain any trace of a persistent configuration left behind by the program.

@ludvigpujsek

ludvigpujsek commented Nov 3, 2024

After digging through the generated temporary files and following the Python functions back to where the settings are read: the file that causes the program to no longer launch properly is a settings.json saved in %userprofile%/.open-interface, i.e. most probably C:/Users/<your_user>/.open-interface. Remember to replace "/" with "\" if copy-pasting the path into File Explorer on Windows. Deleting settings.json makes the program reset the selected model and thus allows it to launch.
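
For convenience, here is the same reset as a Python one-off. It is a minimal sketch assuming only the path reported above; the filename and location come from this comment, everything else is illustrative:

```python
from pathlib import Path

# Path reported above: %userprofile%/.open-interface/settings.json
settings_path = Path.home() / ".open-interface" / "settings.json"

if settings_path.exists():
    settings_path.unlink()  # remove the stale config so the app resets the selected model
    print(f"Deleted {settings_path}; relaunch the app.")
else:
    print(f"Nothing to delete at {settings_path}.")
```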

@Hammad-Mir
Author

Thanks! That resolved the launch issue. Any success running the app with Ollama?

@ludvigpujsek

> Thanks! That resolved the launch issue. Any success running the app with Ollama?

No such luck - I have an exam in 2 days, so I probably won't be able to get back to this 'til after then.

@AmberSahdev
Owner

Ope, fixed it for macOS M series in 0.6.1. I'll compile the Linux/Windows binaries later in the weekend; I don't have my desktop right now.

Thanks for the analysis @ludvigpujsek appreciate it!
