Description
Is there an existing issue for the same bug?
- I have checked the existing issues.
RAGFlow workspace code commit ID
space
RAGFlow image version
v0.17.0-slim
Other environment information
Ubuntu, Docker deployment
Actual behavior
Without a knowledge base attached, the maximum output length of the novel generated by the LLM tops out at just over 2,300 characters in my tests.
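For context, a minimal sketch of how the output length was measured is below. The base URL, endpoint path, payload, and response fields here are assumptions for illustration, not RAGFlow's confirmed API; adjust them to match the actual deployment.

```python
import requests

# All values below are placeholders / assumptions, not RAGFlow's confirmed API.
BASE_URL = "http://localhost:9380"        # assumed RAGFlow host
API_KEY = "YOUR_API_KEY"                  # assumed API key
CHAT_ID = "YOUR_CHAT_ASSISTANT_ID"        # assumed chat assistant ID

def measure_answer_length(question: str) -> int:
    """Send one question to the chat assistant and return the answer length in characters."""
    resp = requests.post(
        f"{BASE_URL}/api/v1/chats/{CHAT_ID}/completions",   # assumed endpoint path
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"question": question, "stream": False},        # assumed payload shape
        timeout=600,
    )
    resp.raise_for_status()
    answer = resp.json().get("data", {}).get("answer", "")   # assumed response shape
    return len(answer)

if __name__ == "__main__":
    print(measure_answer_length("Write a 30,000-word novel."))
```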
Expected behavior
No response
Steps to reproduce
1. Create an assistant.
2. Turn off the knowledge base (no knowledge base attached).
3. Change the prompt to a general LLM assistant prompt.
4. Start a new chat and ask: "Write a 30,000-word novel."
Additional information
No response