Description
Is there an existing issue for the same bug?
- I have checked the existing issues.
RAGFlow workspace code commit ID
RAGFlow image version
v0.16.0-86-g38e551cc full
Other environment information
Ubuntu 22.04
Actual behavior
In the chat, I used the DeepSeek-R1 model and set the maximum output token limit to 10,000. However, when I asked a question, the answer was truncated after only 618 characters, well below the configured 10,000-token limit.


Expected behavior
The answer should be generated in full, up to the configured 10,000-token limit, without truncation.
Steps to reproduce
1. Configure the deepseek-r1 model (Tian Yi Yun provider).
2. Set the max tokens to 10,000.
3. Ask a question in the chat.
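
To narrow down whether the truncation comes from RAGFlow or from the provider, the same request RAGFlow issues can be reproduced directly against the provider's OpenAI-compatible endpoint. The sketch below only assembles the payload; the model name and token limit mirror this report, while the prompt, endpoint, and key would need to be filled in with real values (they are placeholders, not from the issue):

```python
import json

def build_chat_request(prompt: str, max_tokens: int = 10000) -> dict:
    """Assemble an OpenAI-compatible chat-completion payload that mirrors
    the settings described in this report (hypothetical helper)."""
    return {
        "model": "deepseek-r1",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,  # the limit set in the RAGFlow dialog
        "stream": True,            # RAGFlow streams chat answers
    }

# Example prompt is a placeholder, not the question from the report.
payload = build_chat_request("Explain retrieval-augmented generation.")
print(json.dumps(payload, indent=2))
```

Posting this payload to the provider's endpoint and checking the returned `finish_reason` (`"length"` means the provider itself hit a token cap, `"stop"` means it finished normally) would show on which side the answer is being cut off.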
Additional information
No response