[Bug]: DeepSeek-R1 Output Truncated at 618 Characters Despite Max Token Setting of 10,000 in chat assistant #5140

Open
@qust5979

Description

Is there an existing issue for the same bug?

  • I have checked the existing issues.

RAGFlow workspace code commit ID

38e551c

RAGFlow image version

v0.16.0-86-g38e551cc full

Other environment information

Ubuntu 22.04

Actual behavior

In the chat, I used the DeepSeek-R1 model and set the maximum output token limit to 10,000. However, when I asked a question, the answer was cut off at 618 characters, far below the configured 10,000-token limit.

(screenshot of the truncated answer attached)

Expected behavior

The expected answer output should not be truncated.

Steps to reproduce

1. Configure the model deepseek-r1 (Tian yi yun).
2. Set max tokens to 10,000.
3. Start a chat.
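
To help narrow down whether the truncation happens inside RAGFlow or at the upstream provider, the same question can be sent directly through an OpenAI-compatible chat-completion request with `max_tokens=10000` and the `finish_reason` inspected (`"length"` would mean the provider itself cut the output). A minimal sketch of the request payload, assuming the model name `deepseek-r1` as configured above (the endpoint and key are placeholders, not part of this report):

```python
import json

def build_chat_request(question: str, max_tokens: int = 10000) -> dict:
    """Build an OpenAI-compatible chat-completion payload.

    The max_tokens field is the limit that appears to be ignored in
    the reported behavior; sending this payload directly to the
    provider's endpoint bypasses RAGFlow entirely.
    """
    return {
        "model": "deepseek-r1",  # model name as configured in RAGFlow
        "messages": [{"role": "user", "content": question}],
        "max_tokens": max_tokens,
        "stream": False,
    }

payload = build_chat_request("Explain the transformer architecture in detail.")
print(json.dumps(payload, ensure_ascii=False))
# After POSTing this to the provider's /chat/completions endpoint,
# check choices[0].finish_reason: "length" means the provider truncated,
# "stop" means the truncation is introduced elsewhere (e.g. in RAGFlow).
```

If the direct call returns a full-length answer, the bug is likely in how RAGFlow forwards (or fails to forward) the configured `max_tokens` value to the provider.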

Additional information

No response

Metadata

Assignees

No one assigned

    Labels

    🐞 bug (Something isn't working)

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests