Closed
Description
Is there an existing issue for the same feature request?
- I have checked the existing issues.
Is your feature request related to a problem?
Describe the feature you'd like
I love this project very much. Dify is a US-based AI platform; I believe RAGFlow can be the best AI platform from China.
This is a convenience feature: many developers and users run Ollama directly on localhost, e.g. on a local MacBook or another kind of laptop.
The current RAGFlow documentation (https://github.com/infiniflow/ragflow/blob/main/docs/guides/deploy_local_llm.mdx) guides users to run Ollama in Docker. Could you consider supporting Ollama running directly on the localhost machine?
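As a rough sketch of the requested setup (assuming RAGFlow itself runs in Docker and Ollama runs natively on the host; `host.docker.internal` is available on Docker Desktop for macOS/Windows, and the port 11434 is Ollama's default):

```shell
# Start Ollama natively on the host machine (no Docker)
ollama serve &
ollama pull llama3

# From inside the RAGFlow container, the host-side Ollama would be
# reachable at host.docker.internal rather than localhost, e.g.:
#   base URL: http://host.docker.internal:11434
# Quick connectivity check from the host itself:
curl http://localhost:11434/api/tags
```

On Linux, `host.docker.internal` may need to be mapped explicitly, e.g. with `--add-host=host.docker.internal:host-gateway` when starting the container; this is an assumption about the deployment, not something the RAGFlow docs currently describe.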
Describe implementation you've considered