
[Question]: RAGFlow fails to add an Ollama model via add_llm() while a direct curl call succeeds #5841

Closed
@killeress

Description
I am encountering an issue where adding an Ollama model as a model provider in RAGFlow fails, even though the same Ollama server responds correctly to a direct curl call:

curl -X POST http://ihrtn.cminl.oa/ollama/api/generate \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-r1:32b",
    "prompt": "Please help me write a song",
    "max_tokens": 1000,
    "stream": false,
    "decode": true
  }'

[Screenshot: successful response from the curl call]

This returns a valid response, indicating that the Ollama server and the model (deepseek-r1:32b) are working correctly.
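One caveat: RAGFlow's Ollama integration presumably talks to Ollama's chat endpoint (/api/chat) rather than /api/generate, so a probe closer to what RAGFlow actually sends would be something like the following (a sketch using Ollama's documented /api/chat request format; the endpoint choice is an assumption about RAGFlow's internals):

curl -X POST http://ihrtn.cminl.oa/ollama/api/chat \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-r1:32b",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": false
  }'

If this returns nginx's 404 page while /api/generate succeeds, the reverse proxy is probably forwarding only the /api/generate path.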

Failed RAGFlow configuration: In RAGFlow, I attempt to add the model using the "Add Model Provider" feature with the following configuration:

POST payload:

{
  "model_type": "chat",
  "llm_name": "deepseek-r1:32b",
  "api_base": "http://ihrtn.cminl.oa/ollama",
  "api_key": "",
  "max_tokens": 8192,
  "llm_factory": "Ollama"
}

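Since api_base points at a path behind a reverse proxy, it is worth confirming that the proxy forwards Ollama's other API paths too, not just /api/generate. For example, /api/tags is Ollama's documented model-listing endpoint and makes a cheap connectivity check:

# Should return a JSON list of local models if the proxy
# forwards the /api/tags path through to Ollama.
curl http://ihrtn.cminl.oa/ollama/api/tags

If this comes back as a plain nginx 404, the proxy configuration, not RAGFlow, is rejecting the request.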
error:

hint: 102
Fail to access model(deepseek-r1:32b). ERROR: <title>404 Not Found</title>

(The rest of the response is nginx's standard "404 Not Found" error page.)
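The nginx error page suggests the 404 comes from the reverse proxy in front of Ollama rather than from Ollama itself. If RAGFlow is running in Docker (the default compose setup), it is also worth checking the same URL from inside the container, since the container may resolve internal hostnames differently from the host. A sketch, assuming the stock container name from RAGFlow's docker-compose:

# Run on the Docker host; "ragflow-server" is the default container
# name in RAGFlow's compose files (adjust if yours differs).
docker exec -it ragflow-server curl -s http://ihrtn.cminl.oa/ollama/api/tags

If this fails inside the container while the same curl works on the host, the problem is DNS or network visibility from the container rather than the model configuration.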

[Screenshot: RAGFlow error message]

Here is my ragflow-server log:

[Screenshot: ragflow-server log]
