
Models from SiliconFlow (硅基流动) and Tencent Cloud can be configured successfully, but an error occurs during chat #5251

Open
@cjqyxq

Description


Is there an existing issue for the same bug?

  • I have checked the existing issues.

RAGFlow workspace code commit ID

RAGFlow image version

v0.16.0 slim

Other environment information

Ubuntu 22.04 virtual machine

Actual behavior

When setting up models, the SiliconFlow (硅基流动) and Tencent Cloud models can be configured successfully, but chat fails with:
ERROR: LLM(deepseek-r1___OpenAI-API@OpenAI-) not found

Expected behavior

The API should be called normally. A few days ago, before upgrading to v0.16.0, I was on v0.15.0 and it worked fine, but it did not include the deepseek-r1 model, so I upgraded; after the upgrade the error above appeared.
Models served by my self-hosted Ollama work normally.

Steps to reproduce

Error shown during chat:
ERROR: LLM(deepseek-r1___OpenAI-API@OpenAI-) not found

Error in the logs:
Traceback (most recent call last):
  File "/ragflow/api/apps/conversation_app.py", line 230, in stream
    for ans in chat(dia, msg, True, **req):
  File "/ragflow/api/db/services/dialog_service.py", line 186, in chat
    raise LookupError("LLM(%s) not found" % dialog.llm_id)
LookupError: LLM(deepseek-r1___OpenAI-API@OpenAI-) not found
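
From the traceback, dialog_service.py raises the LookupError when the dialog's llm_id cannot be matched against the configured models. The id "deepseek-r1___OpenAI-API@OpenAI-" looks like a composite "<model>@<factory>" key whose factory part ends in a truncated "OpenAI-", which would make any exact-match lookup fail. The sketch below only illustrates that failure mode under the assumption of a composite-id lookup; the helper names and the sample data are hypothetical and are not RAGFlow's actual code.

```python
# Illustration only: how a composite model id of the form "<model>@<factory>"
# could fail an exact-match lookup when the factory suffix is truncated.
# Only the id string and the error message come from the traceback above;
# everything else here is hypothetical.

def split_model_id(llm_id: str) -> tuple[str, str]:
    """Split 'deepseek-r1___OpenAI-API@OpenAI-' into (model_name, factory)."""
    name, _, factory = llm_id.rpartition("@")
    return name, factory

# Models assumed to be configured for the tenant (hypothetical sample).
configured = {
    ("deepseek-r1___OpenAI-API", "OpenAI-API-Compatible"),
}

def lookup(llm_id: str) -> None:
    key = split_model_id(llm_id)
    if key not in configured:
        # Mirrors the error raised in dialog_service.py
        raise LookupError("LLM(%s) not found" % llm_id)

lookup("deepseek-r1___OpenAI-API@OpenAI-")  # raises: factory "OpenAI-" has no exact match
```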

Additional information

No response
