File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__ pydantic.error_wrappers.ValidationError: 1 validation error for LlamaCppEmbeddings #461

Closed
@sandyrs9421

Description

I am seeing the error below when I run ingest.py. Any thoughts on how I can resolve it? Kindly advise.

Error -

```
error loading model: this format is no longer supported (see ggerganov/llama.cpp#1305)
llama_init_from_file: failed to load model
Traceback (most recent call last):
  File "/Users/FBT/Desktop/Projects/privategpt/privateGPT/ingest.py", line 39, in <module>
    main()
  File "/Users/FBT/Desktop/Projects/privategpt/privateGPT/ingest.py", line 30, in main
    llama = LlamaCppEmbeddings(model_path="./models/ggml-model-q4_0.bin")
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for LlamaCppEmbeddings
__root__
  Could not load Llama model from path: ./models/ggml-model-q4_0.bin. Received error (type=value_error)
```
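The first line of the error is the real cause: the `.bin` file uses a ggml container format that llama.cpp dropped support for in ggerganov/llama.cpp#1305, and pydantic only reports the resulting load failure as a validation error. As a hedged sketch, you can check which container format a model file uses by reading its 4-byte magic number; the magic values below are taken from the llama.cpp sources of that era (they have since changed again, e.g. with GGUF), and `ggml_format` is a hypothetical helper, not part of privateGPT or llama.cpp:

```python
import struct

# Magic numbers from llama.cpp around the time of PR #1305 (assumption:
# these are the values used by that era's loader, not current GGUF files).
MAGIC_GGML = 0x67676D6C  # 'ggml' - unversioned legacy format
MAGIC_GGMF = 0x67676D66  # 'ggmf' - versioned legacy format
MAGIC_GGJT = 0x67676A74  # 'ggjt' - mmap-able format introduced by #1305

def ggml_format(path):
    """Return a short description of the model file's container format."""
    with open(path, "rb") as f:
        magic = struct.unpack("<I", f.read(4))[0]  # little-endian uint32
    if magic == MAGIC_GGJT:
        return "ggjt (loadable by post-#1305 llama.cpp)"
    if magic in (MAGIC_GGML, MAGIC_GGMF):
        return "legacy ggml/ggmf (rejected by post-#1305 llama.cpp)"
    return "unknown (not a ggml file?)"
```

If the file reports a legacy format, the usual fix at the time was to re-download a model published in the newer format or re-run the conversion/quantization scripts from a current llama.cpp checkout.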

My .env file -

```
PERSIST_DIRECTORY=db
MODEL_TYPE=GPT4All
MODEL_PATH=models/ggml-gpt4all-j-v1.3-groovy.bin
EMBEDDINGS_MODEL_NAME=/Users/FBT/Desktop/Projects/privategpt/privateGPT/models/ggml-model-q4_0.bin
MODEL_N_CTX=1000
```
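For context on how these values reach ingest.py: the primordial privateGPT reads the `.env` file at startup (via the python-dotenv package, if I recall the code correctly) and then pulls each key from the process environment. A minimal stdlib-only sketch of that key=value loading, using a hypothetical `load_env` helper purely for illustration:

```python
import os

def load_env(path=".env"):
    """Minimal .env loader: export KEY=VALUE lines into os.environ.
    (Hypothetical helper for illustration; privateGPT itself relies on
    python-dotenv's load_dotenv().)"""
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blanks, comments, and lines without a KEY=VALUE shape.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

So whatever path is in `EMBEDDINGS_MODEL_NAME` here is exactly what gets handed to the embeddings constructor at line 30 of ingest.py.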

Labels

bug: Something isn't working
primordial: Related to the primordial version of PrivateGPT, which is now frozen in favour of the new PrivateGPT
