Description
Opened on Jun 14, 2024
Can someone guide me through setting up GPT4All from source code?
I have been following this guide:
https://github.com/nomic-ai/gpt4all/blob/main/gpt4all-chat/build_and_run.md
I installed Qt as described there, then built gpt4all-backend and gpt4all-chat:

```shell
cmake ../../gpt4all-backend/ -DLLMODEL_CUDA=OFF -DLLMODEL_KOMPUTE=OFF
```
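For context, here is the fuller sequence I followed, as I understood it from build_and_run.md. This is a sketch of my attempt, not the documented steps verbatim; the Qt prefix path is a placeholder for my local install, not a real path from the guide.

```shell
# Sketch of my out-of-source build attempt; the Qt path is a placeholder.
# Assumes the repository was cloned with submodules:
#   git clone --recurse-submodules https://github.com/nomic-ai/gpt4all
mkdir -p gpt4all-chat/build
cd gpt4all-chat/build

# Configure, pointing CMake at the Qt installation and disabling
# the GPU backends (the same flags as in my command above).
cmake ../../gpt4all-backend/ \
  -DCMAKE_PREFIX_PATH=/path/to/Qt/6.x/gcc_64 \
  -DLLMODEL_CUDA=OFF -DLLMODEL_KOMPUTE=OFF

# Build, then run the resulting chat binary from bin/.
cmake --build . --parallel
./bin/chat
```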
The chat executable is generated in gpt4all-chat/build/bin; when I run it, I get the following output:
```
constructGlobalLlama: could not find Llama implementation for backend: kompute
constructGlobalLlama: could not find Llama implementation for backend: cuda
[Warning] (Thu Jun 13 23:20:20 2024): QQmlApplicationEngine failed to load component
[Warning] (Thu Jun 13 23:20:20 2024): qrc:/gpt4all/main.qml:66:5: Type ChatView unavailable
[Warning] (Thu Jun 13 23:20:20 2024): qrc:/gpt4all/qml/ChatView.qml:483:5: Type SettingsDialog unavailable
[Warning] (Thu Jun 13 23:20:20 2024): qrc:/gpt4all/qml/SettingsDialog.qml:104:9: Type MySettingsStack unavailable
[Warning] (Thu Jun 13 23:20:20 2024): qrc:/gpt4all/qml/MySettingsStack.qml:67:30: IconLabel is not a type
[Warning] (Thu Jun 13 23:20:20 2024): QIODevice::read (QNetworkReplyHttpImpl): device not open
[Warning] (Thu Jun 13 23:20:20 2024): ERROR: Couldn't parse: "" "illegal value"
```
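I suspect the first two lines are just a consequence of the flags I passed, since I disabled those backends at configure time. If I wanted to try with the GPU backends enabled, I assume the reconfigure would look like this (the flag names are taken from my own command above; whether this resolves anything is a guess):

```shell
# Reconfigure with the kompute and cuda backends enabled,
# reversing the flags I disabled earlier, then rebuild.
cmake ../../gpt4all-backend/ -DLLMODEL_CUDA=ON -DLLMODEL_KOMPUTE=ON
cmake --build . --parallel
```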
Can someone direct me to the proper documentation? Specifically, how do I use the bindings, and how do I set the model that should run?
Alternatively, if anyone can explain the process or help me build this locally, I can start working on the documentation myself.
I need the build and run steps for gpt4all-backend, gpt4all-bindings, gpt4all-chat, and gpt4all-training.
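For the bindings part of the question, my current understanding (an assumption on my part, not something I have verified) is that the Python bindings live under gpt4all-bindings/python in the repository and can be installed locally into a virtual environment like this:

```shell
# Assumed steps for a local install of the Python bindings;
# the directory name comes from the repository layout, the rest is my guess.
cd gpt4all-bindings/python
python3 -m venv venv && . venv/bin/activate
pip install -e .
```

If someone can confirm or correct these steps, I would include them in the documentation as well.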
Labels: Improvements or additions to documentation