
[Bug] The GitHub o1 model does not support streaming; please adapt requests to non-streaming output #5145

Open
Zzz3626 opened this issue Dec 23, 2024 · 5 comments
Labels
🐛 Bug Something isn't working | 缺陷

Comments

@Zzz3626

Zzz3626 commented Dec 23, 2024

📦 Deployment environment

Docker

📌 Software version

v1.36.46

💻 System environment

Windows

🌐 Browser

Edge

🐛 Problem description

{
  "error": {
    "code": "unsupported_value",
    "type": "invalid_request_error",
    "param": "stream",
    "message": "Unsupported value: 'stream' does not support true with this model. Supported values are: false."
  },
  "endpoint": "https://models.inference.ai.azure.com/",
  "provider": "github"
}

📷 Steps to reproduce

No response

🚦 Expected result

No response

📝 Additional information

No response

@Zzz3626 Zzz3626 added the 🐛 Bug Something isn't working | 缺陷 label Dec 23, 2024
@github-project-automation github-project-automation bot moved this to Roadmap - Chat 1.x in Lobe Chat Routine Dec 23, 2024
@Zzz3626 Zzz3626 changed the title from "[Bug] The GitHub o1 model does not support streaming; please automatically adapt requests to non-streaming output" to "[Bug] The GitHub o1 model does not support streaming; please adapt requests to non-streaming output" Dec 23, 2024
@lobehubbot
Member

👀 @Zzz3626

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.


dosubot bot commented Dec 23, 2024

I found a similar discussion regarding the issue of models not supporting streaming in the lobe-chat framework. The discussion is titled "调用chat的AI接口,传入stream为false,还是调用流式数据,或者返回报错" (calling the chat AI endpoint with stream set to false still returns streamed data, or returns an error) and it remains unsolved [1].

Regarding your issue, the error message indicates that the model you are using does not support streaming (stream: true). The lobe-chat framework handles such cases by returning an error message when a model does not support a requested feature like streaming. In your case, the error message specifies that the stream parameter is not supported with the value true for the model you are using [2].

To resolve this, you can set the stream parameter to false for models that do not support streaming. This adjustment should allow the model to process requests without attempting to use unsupported streaming features.
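As a rough illustration of that workaround, here is a minimal sketch in TypeScript, assuming an OpenAI-compatible client pointed at the endpoint shown in the error payload; the model list and helper function here are hypothetical, not lobe-chat's actual internals:

```ts
import OpenAI from "openai";

// Assumption: the GitHub Models endpoint is OpenAI-compatible,
// as the error payload above suggests.
const client = new OpenAI({
  baseURL: "https://models.inference.ai.azure.com/",
  apiKey: process.env.GITHUB_TOKEN ?? "",
});

// Hypothetical list of models known to reject `stream: true`.
const NON_STREAMING_MODELS = new Set(["o1", "o1-mini", "o1-preview"]);

async function chat(model: string, content: string): Promise<string> {
  if (NON_STREAMING_MODELS.has(model)) {
    // Fall back to a single non-streaming request for o1-family models.
    const res = await client.chat.completions.create({
      model,
      messages: [{ role: "user", content }],
      stream: false,
    });
    return res.choices[0].message.content ?? "";
  }
  // Stream normally for models that support it.
  const stream = await client.chat.completions.create({
    model,
    messages: [{ role: "user", content }],
    stream: true,
  });
  let out = "";
  for await (const chunk of stream) {
    out += chunk.choices[0]?.delta?.content ?? "";
  }
  return out;
}
```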


@jiajiaxd

#5110 seems to have already adapted this, but the o1 model still needs to be added manually as a custom model (see the screenshot below).
[screenshot: manually adding the o1 model as a custom model]
If o1 is added as a custom model, the problem in this issue reproduces.
Version: v1.37.1


@RS-Imagine

RS-Imagine commented Dec 23, 2024

I can reproduce this error as well.
Also, please consider disabling model fetching for the GitHub provider by default: modelList: { showModelFetcher: true }, // I'm not sure if it is good to show the model fetcher, as remote list is not complete.
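For illustration, a sketch of what that default might look like in a provider config; only showModelFetcher comes from the snippet quoted above, and the surrounding structure is an assumption rather than lobe-chat's actual config shape:

```ts
// Sketch of a provider config object; the shape is assumed, not lobe-chat's real type.
const Github = {
  id: "github",
  modelList: {
    // Default to not fetching the remote model list, since it is incomplete
    // and fetched o1 entries trigger the streaming error reported in this issue.
    showModelFetcher: false,
  },
};

export default Github;
```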
