OpenAI /completions route fails #30
Comments
I have the same behaviour with the OpenAI SDK, with the following Python script:
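A minimal sketch of such a script (not the commenter's original), assuming the llamafile server's default port 8080, the openai>=1.0 Python client, and the placeholder model name "LLaMA_CPP":

```python
# Illustrative reproduction sketch (not the commenter's original script).
# Assumptions: local llamafile server at http://localhost:8080, openai>=1.0
# Python client, model alias "LLaMA_CPP"; the API key is not validated locally.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",
    api_key="sk-no-key-required",
)

# Calling the completions route without a "stop" value is what reproduces
# the failure discussed in this issue.
completion = client.completions.create(
    model="LLaMA_CPP",
    prompt="Building a website can be done in 10 simple steps:",
)
print(completion.choices[0].text)
```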
Hi. I have the same behavior on Linux/x86.
Still the same behaviour with 2.1.0.
It looks like "stop" is required. Try passing it in.
llamafile/llama.cpp/server/server.cpp, lines 2402 to 2408 in 73ee0b1
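A sketch of the suggested workaround under the same assumptions as the script above: pass an explicit stop list so the request's "stop" field is never missing. The stop token "</s>" is an illustrative choice, not something specified in the thread.

```python
# Workaround sketch: supply "stop" explicitly with the completions request.
# Assumptions: local llamafile server on port 8080, openai>=1.0 client,
# model alias "LLaMA_CPP", and "</s>" as an example stop sequence.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="sk-no-key-required")

completion = client.completions.create(
    model="LLaMA_CPP",
    prompt="Building a website can be done in 10 simple steps:",
    stop=["</s>"],  # explicit stop sequence, per the comment above
)
print(completion.choices[0].text)
```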
Mardak added a commit to Mardak/llamafile that referenced this issue on Dec 1, 2023.
jart pushed a commit that referenced this issue on Dec 2, 2023.
Original issue description:

Hello,
Thank you for the new release including the OpenAI routes, but after trying it, it always returns the following error when using the raw request from the README.md:
I'm on M1 with the mistral-7b-instruct-v0.1.Q5_K_M.gguf and llama-2-7b-chat.Q5_K_S.gguf models. I didn't try with the OpenAI SDK.
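For reference, a rough Python equivalent of a raw request against the completions route (URL, port, and payload fields are assumptions; the README's exact curl command and the error output are not quoted here):

```python
# Rough stand-in for the README's raw request. Assumes a local llamafile
# server on port 8080 and the /v1/completions route named in the title;
# the model alias and prompt are placeholders.
import json
import urllib.request

payload = {
    "model": "LLaMA_CPP",
    "prompt": "Building a website can be done in 10 simple steps:",
}

req = urllib.request.Request(
    "http://localhost:8080/v1/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# A server-side failure like the one reported here surfaces as an
# urllib.error.HTTPError when the response status is not 2xx.
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode("utf-8"))
```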