[Question]: answer won't follow the output format specified in the system prompt #5952

Open
@tonyzzzzz

Description
Self Checks

  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (Language Policy).
  • Non-English title submissions will be closed directly (Language Policy).
  • Please do not modify this template :) and fill in all the required fields.

Describe your problem

We have a RAGFlow + Ollama (DeepSeek) system deployed on a Linux box. We tried to fix the answer format using a system prompt, but the answers from RAGFlow don't keep to that format. As a comparison, Ollama with the same system prompt (via open-UI) does not have this issue, i.e., all the answers follow the format specified by the system prompt. The model configurations are the same for both (e.g., temp = 0.1, top-k = 0.3, etc.).
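
For reference, this is a minimal sketch of the ollama-direct baseline described above: sending the same system prompt and sampling settings straight to the local ollama chat endpoint, outside of RAGFlow, to check whether the answer keeps the requested format. The endpoint URL, model tag, and prompt text are placeholders, not taken from the actual deployment.

```python
import requests

# Assumed local ollama endpoint and DeepSeek model tag -- replace with
# the values used in the actual deployment.
OLLAMA_URL = "http://localhost:11434/api/chat"
MODEL = "deepseek-r1"

# Placeholder system prompt illustrating a fixed output format.
SYSTEM_PROMPT = (
    "Answer strictly in the following format:\n"
    "Summary: <one sentence>\n"
    "Details: <bulleted list>"
)

payload = {
    "model": MODEL,
    "stream": False,
    "messages": [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "What is RAGFlow?"},
    ],
    # Same sampling settings reported in the issue (temperature = 0.1, etc.).
    "options": {"temperature": 0.1},
}

resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```

If the answer returned here follows the format but the RAGFlow chat assistant (with the same system prompt pasted into its prompt settings) does not, that would point at how RAGFlow assembles its final prompt rather than at the model or its sampling options.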
