I am trying to start a code completion server locally with the llama-cpp-python wrapper (see here). I am currently running this to start the server:
Now, I realize I am using a downstream library to get to llama.cpp, but I figured somebody here might be able to tell me what the assertion above even means; it might then be obvious what I'm doing wrong. Thanks in advance.
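For context, a typical way to launch the llama-cpp-python OpenAI-compatible server looks like the sketch below; the model path is a placeholder, and this is not necessarily the exact command that triggered the assertion above.

```shell
# Install llama-cpp-python with the server extras
# (assumption: a pip-based environment)
pip install 'llama-cpp-python[server]'

# Start the OpenAI-compatible completion server with a local GGUF model
# (models/7B/llama-model.gguf is a placeholder path, not the asker's model)
python3 -m llama_cpp.server --model models/7B/llama-model.gguf
```

By default the server listens on localhost:8000; an assertion raised at this point usually comes from the underlying llama.cpp code while loading or evaluating the model, not from the Python wrapper itself.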
This is likely a regression from #4657 (cc @postmasters)