[Bug]: The length of the output does not match the "max_token" setting #5797

Closed
@DankerMu

Description

Is there an existing issue for the same bug?

  • I have checked the existing issues.

RAGFlow workspace code commit ID

space

RAGFlow image version

v0.17.0-slim

Other environment information

Ubuntu, Docker deployment

Actual behavior

Without a knowledge base attached, testing shows that the maximum output length of a novel generated by the LLM is only just over 2,300 characters.

Expected behavior

No response

Steps to reproduce

1. Create an assistant.
2. Turn off the knowledge base.
3. Change the prompt to a general LLM assistant prompt.
4. Start a new chat and ask: "Write a 30,000-word novel."
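The symptom above is consistent with the completion being capped by a token limit rather than by the prompt's requested length. As a rough sketch (the chars-per-token ratio below is an illustrative assumption, not a RAGFlow constant), the character ceiling implied by a token cap can be estimated as:

```python
def estimated_char_ceiling(max_tokens: int, chars_per_token: float = 1.5) -> int:
    """Rough character ceiling implied by a completion-token cap.

    chars_per_token is an assumed average (Chinese text is often
    roughly 1-2 characters per token, depending on the tokenizer);
    it is for illustration only.
    """
    return int(max_tokens * chars_per_token)

# A cap in the low thousands of tokens would truncate a "30,000-word
# novel" request to a few thousand characters, regardless of the prompt.
print(estimated_char_ceiling(1536))  # 2304 with the assumed ratio
```

Under these assumptions, a cap around 1,500 tokens would land close to the observed ~2,300-character limit, which suggests checking where the effective max-token value is set or clamped.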

Additional information

No response

Metadata

Assignees

No one assigned

    Labels

    🐞 bug: Something isn't working
