
[Bug]: v0.17.1 Adding an xinference model fails with Connection error #5982

Closed
@ran411285752

Description

Self Checks

  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (Language Policy).
  • Non-English title submissions will be closed directly (Language Policy).
  • Please do not modify this template :) and fill in all the required fields.

RAGFlow workspace code commit ID

d447392

RAGFlow image version

d447392 (v0.17.1)

Other environment information

Actual behavior

With v0.17.0, adding a local model served by xinference works without any problem.
With v0.17.1, adding a local model served by xinference using the same steps and configuration fails with a Connection error.

Expected behavior

With v0.17.1, local models served by xinference can be added normally.

Steps to reproduce

1. Run RAGFlow with the v0.17.1 Docker image.
2. Add an LLM model using the xinference provider.
3. Any model added under v0.17.1, regardless of model type, fails with "Connection error" (see the connectivity sketch below).
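
To help narrow down whether the failure is in RAGFlow v0.17.1 itself or in the container's network path to xinference, a minimal check can call the same OpenAI-compatible endpoints the logs show retrying (/chat/completions and /embeddings) from inside the RAGFlow container. This is only a sketch: the base URL below is a placeholder for the local xinference address, and the model names are the ones from the logs above.

```python
# Minimal connectivity check; run from inside the ragflow-server container.
# Assumption: xinference exposes its OpenAI-compatible API at <xinference-host>:9997
# (replace with your actual address/port) and serves the models named in the logs.
from openai import OpenAI

BASE_URL = "http://<xinference-host>:9997/v1"  # placeholder; use your xinference endpoint
client = OpenAI(base_url=BASE_URL, api_key="none")  # any non-empty key if auth is not enabled

try:
    chat = client.chat.completions.create(
        model="qwen2.5-instruct",
        messages=[{"role": "user", "content": "ping"}],
        max_tokens=8,
    )
    print("chat OK:", chat.choices[0].message.content)

    emb = client.embeddings.create(model="Conan-embedding-v1", input=["ping"])
    print("embeddings OK, dim =", len(emb.data[0].embedding))
except Exception as exc:
    # A connection-level failure here points at the container-to-xinference
    # network path (e.g. 127.0.0.1 vs. the host/service address) rather than RAGFlow.
    print("request failed:", exc)
```

If this also fails with a connection error from inside the container, the problem is most likely the address being used; if it succeeds, the regression is on the RAGFlow v0.17.1 side.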

Additional information

Error log:
2025-03-12 17:41:56,913 INFO 19 Retrying request to /chat/completions in 0.782334 seconds
2025-03-12 17:41:57,714 INFO 19 Retrying request to /chat/completions in 1.503957 seconds
2025-03-12 17:41:59,230 ERROR 19
Fail to access model(qwen2.5-instruct).ERROR: Connection error.
NoneType: None

2025-03-12 17:58:43,702 INFO 19 Retrying request to /embeddings in 0.873221 seconds
2025-03-12 17:58:44,579 INFO 19 Retrying request to /embeddings in 1.891999 seconds
2025-03-12 17:58:46,477 ERROR 19
Fail to access embedding model(Conan-embedding-v1).Connection error.
NoneType: None
