Description
I am encountering an issue where adding an Ollama model as a model provider in RAGFlow fails, even though I can successfully call the same Ollama server with a direct curl command:
curl -X POST http://ihrtn.cminl.oa/ollama/api/generate \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek-r1:32b", "prompt": "Please help me write a song", "max_tokens": 1000, "stream": false, "decode": true}'
This returns a valid response, indicating the Ollama server and the model (deepseek-r1:32b) are working correctly.
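As a further check (a hedged suggestion, since I have not confirmed RAGFlow's internals): if RAGFlow talks to Ollama through the official ollama client library, chat models are called via POST /api/chat rather than /api/generate, so it is worth verifying that the proxy also forwards that path:

curl -X POST http://ihrtn.cminl.oa/ollama/api/chat \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek-r1:32b", "messages": [{"role": "user", "content": "hello"}], "stream": false}'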
Failed RAGFlow configuration: In RAGFlow, I attempt to add the model using the "Add Model Provider" feature, which sends the following POST body:
{
  "model_type": "chat",
  "llm_name": "deepseek-r1:32b",
  "api_base": "http://ihrtn.cminl.oa/ollama",
  "api_key": "",
  "max_tokens": 8192,
  "llm_factory": "Ollama"
}
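For reference, the same payload can be replayed from the command line to take the web UI out of the picture. This is only a sketch: the /v1/llm/add_llm route and the Authorization header are my assumptions about how the web UI calls the API server, so adjust them to your deployment:

# Assumed endpoint and auth header, not a confirmed API; host and token are placeholders.
curl -X POST http://<ragflow-host>/v1/llm/add_llm \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <ragflow-session-token>" \
  -d '{"model_type": "chat", "llm_name": "deepseek-r1:32b", "api_base": "http://ihrtn.cminl.oa/ollama", "api_key": "", "max_tokens": 8192, "llm_factory": "Ollama"}'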
Error:

hint: 102
Fail to access model(deepseek-r1:32b).ERROR: <title>404 Not Found</title>
404 Not Found
nginx
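The 404 page is served by nginx, which suggests the request reaches the reverse proxy but is not forwarded to Ollama under the path RAGFlow requests. A quick way to narrow this down is to probe a standard Ollama GET endpoint through the same proxy prefix:

# /api/tags lists local models; if the /ollama prefix is proxied correctly,
# this returns JSON from Ollama instead of an nginx 404 page.
curl -i http://ihrtn.cminl.oa/ollama/api/tags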
Here is my ragflow-server log: