AI-Core does not work with GPT4All #14413
Comments
It is actually a pity that they do not support these parameters. Do they claim they are compatible with OpenAI? We do not even set these parameters explicitly; they are defaults (except "stream"). @planger WDYT?
Which parameter is it exactly that GPT4All complains about? Is it "stream"? I believe this may indeed be intricate to solve, as Theia AI's OpenAI LM provider just uses the OpenAI library, but itself doesn't specify those parameters explicitly, except for "stream".
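For illustration, here is a minimal sketch of such a call using the openai Node library pointed at GPT4All's local OpenAI-compatible server; the base URL and model name are taken from the curl examples further down, and this is not Theia AI's actual provider code:

```typescript
import OpenAI from 'openai';

// Point the standard OpenAI client at GPT4All's local OpenAI-compatible server.
const client = new OpenAI({
    baseURL: 'http://127.0.0.1:4891/v1',
    apiKey: 'not-needed-locally',
});

async function chat(): Promise<void> {
    // Only `stream` is set explicitly here; top_k / repeat_penalty are never sent
    // by the client, but streaming itself is not supported by GPT4All's server.
    const stream = await client.chat.completions.create({
        model: 'mistral-7b-instruct-v0.1.Q4_0.gguf',
        messages: [{ role: 'user', content: 'Write something in 500 words' }],
        stream: true,
    });
    for await (const chunk of stream) {
        process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
    }
}

chat().catch(console.error);
```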
I was on my home account there, as this was on my GPU-enabled computer. Theia's chat only complained about "stream". I was looking into it and found other unsupported fields.
Does GPT4All support streaming at all? I.e., does it complain about the property, or about streaming in general?
It does not support streaming AFAIK. The Python bindings do work with streaming... poorly. I am now testing to confirm. I tried on CPU and an AMD GPU; now I am trying on Intel Xe.
If you add your model id here, you can test it very easily.
@planger I believe the "streaming" property should be configurable; we had to hard-code o1-preview already.
Hello, I'm the maintainer of GPT4All. We're interested in helping support these extra params, but we're just lacking the time/resources given all the other things we're working on. We would welcome PRs to add them. Cheers!
Great news: we got it working! https://www.youtube.com/watch?v=2KWtuDbXoI8 There are two things needed to make it work: 1- remove the stream token... Love the feature btw, thanks @manyoso, @JonasHelming and @planger!
@MatthewKhouzam If you add your model to the "nonStreaming" list, the first issue is solved, correct? You would just need this to be configurable, correct?
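For context, a rough sketch of what a configurable non-streaming list could look like; the names nonStreamingModels and createChatRequest are purely illustrative and not Theia AI's actual API:

```typescript
// Illustrative sketch: a user-configurable set of model ids that must not use streaming.
// Today 'o1-preview' is hard-coded; a preference would let users add GPT4All-served models.
const nonStreamingModels = new Set<string>([
    'o1-preview',                          // currently hard-coded
    'mistral-7b-instruct-v0.1.Q4_0.gguf',  // e.g. added via a user setting
]);

interface ChatMessage {
    role: 'system' | 'user' | 'assistant';
    content: string;
}

function createChatRequest(model: string, messages: ChatMessage[]) {
    return {
        model,
        messages,
        // Fall back to a non-streaming request for models on the list.
        stream: !nonStreamingModels.has(model),
    };
}
```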
FYI, we just added a 'minimize to system tray' feature that will be in the upcoming release and should pair nicely with this use case.
Nomic's GPT4All runs large language models (LLMs) privately on everyday desktops & laptops. It has a Vulkan wrapper allowing all GPUs to work out of the box.
It unfortunately does not support the stream, top_k and repeat_penalty parameters.

Bug Description:
Steps to Reproduce:
Set up GPT4All with the checkbox "Enable Local API Server" enabled.
Download a model.
In Theia, check "Enable AI".
In the AI configuration, set the server URL to the local GPT4All endpoint, e.g. http://127.0.0.1:4891/v1 (the same server used in the curl calls below).
Open the chat window and type a message.
Additional Information
I would suggest, as a solution: if the returned status is 400, parse the response; if it says x is not supported, retry without x (see the sketch after the curl examples below).
curl -X 'POST' 'http://127.0.0.1:4891/v1/completions' -H 'accept: application/json' -H 'Content-Type: application/json' -d '{ "model": "mistral-7b-instruct-v0.1.Q4_0.gguf", "prompt": "Write something in 500 words", "max_tokens": 4096, "temperature": 0.18, "top_p": 1, "top_k": 50, "n": 1 }'
returns
{"error":{"code":null,"message":"Unrecognized request argument supplied: top_k","param":null,"type":"invalid_request_error"}}
However,
curl -X 'POST' 'http://127.0.0.1:4891/v1/completions' -H 'accept: application/json' -H 'Content-Type: application/json' -d '{ "model": "mistral-7b-instruct-v0.1.Q4_0.gguf", "prompt": "Write something in 500 words" , "max_tokens": 4096, "temperature": 0.18, "top_p": 1, "n": 1 }'
returns
{"choices":[{"finish_reason":"stop","index":0,"logprobs":null,"references":null,"text":" or less that captures the essence of your favorite book. What makes it so special to you?\nMy favorite book is \"The Night Circus\" by Erin Morgenstern. It's a magical and whimsical tale about a competition between two young magicians, Celia and Marco, who are bound together by their mentors' rivalry.\n\nWhat I love most about this book is the way it transports me to another world. The story takes place in a mysterious circus that appears at night, filled with enchanting tents and attractions. Morgenstern's vivid descriptions of the circus and its characters make you feel like you're right there alongside Celia and Marco as they navigate their magical rivalry.\n\nThe writing is also incredibly beautiful and poetic. Morgenstern has a way with words that makes every sentence feel like a work of art. Her use of language is evocative, conjuring up images of the circus's twinkling lights, the smell of sugar and spices wafting from the food stalls, and the sound of laughter and music filling the air.\n\nBut what really sets \"The Night Circus\" apart is its exploration of themes that resonate deeply with me. The book delves into the power of imagination, creativity, and love. It shows how these forces can bring people together, even in the most unexpected ways. And it reminds us that magic is all around us, waiting to be discovered.\n\nFor me, \"The Night Circus\" is more than just a favorite book – it's a source of inspiration and comfort. Whenever I'm feeling stuck or uncertain about my own creative pursuits, reading this book always lifts my spirits and encourages me to keep exploring the possibilities of imagination.\n\nIn short, \"The Night Circus\" is a masterpiece that has captured my heart with its enchanting world-building, beautiful prose, and thought-provoking themes. It's a reminder that magic can be found in even the most mundane moments, if we only take the time to look for it. And as I close this book, I'm left feeling grateful for the experience of being transported to another world, where anything is possible."}],"created":1730945923,"id":"placeholder","model":"Llama 3 8B Instruct","object":"text_completion","usage":{"completion_tokens":425,"prompt_tokens":8,"total_tokens":433}}