[Question]: Context Retention Issue in Sequential Questions with RAGFlow #5788

Closed
@Peterson047

Description

@Peterson047

Describe your problem

I've been using RAGFlow for a few weeks and am having difficulty maintaining context across sequential questions.

Scenario:
I have a knowledge base containing sales data segmented by month. When I ask:

"How much did product X sell?"

The model correctly asks for clarification:

"Which month?"

However, when I supply the month in the next prompt, the model often loses track of the original question. Instead of carrying the context forward, it retrieves new chunks based on the follow-up alone, which are no longer relevant to the initial inquiry and lead to inconsistent or hallucinated responses.

What I've tried:

  • Using metadata in the documents (which improved retrieval but didn't fully resolve the issue).
  • Explicit instructions for the model to maintain context across prompts.
  • Different approaches to question formulation.

Questions:

  1. How can I ensure that relevant chunks are aggregated until a complete answer is formed?
  2. What is an effective strategy to help the model retain context between prompts?
  3. The issue seems to stem from the model's instructions having no direct influence on chunk retrieval. What would be the best way to mitigate this?

Any guidance would be greatly appreciated!
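
For anyone hitting the same problem: one common mitigation (not specific to RAGFlow, and not an official feature of it) is to condense the conversation into a standalone query before retrieval, so the retriever sees the full intent rather than just the follow-up ("January"). Below is a minimal Python sketch of the idea; the `condense_query` function and its heuristic are purely illustrative — in practice the rewrite step is usually delegated to the LLM itself.

```python
# Sketch of "query condensation": before retrieving chunks for a follow-up
# turn, merge it with the unresolved question from earlier turns so the
# retriever sees the full intent. The heuristic here (short follow-up =>
# it is answering a clarification) is a stand-in for an LLM-based rewrite.

def condense_query(history: list[str], follow_up: str, max_words: int = 4) -> str:
    """Return a standalone retrieval query for a short follow-up answer.

    A very short follow-up (e.g. just a month name) is almost certainly
    answering a clarification question, so append it to the last full
    user question instead of retrieving on it alone.
    """
    if history and len(follow_up.split()) <= max_words:
        return f"{history[-1]} ({follow_up})"
    return follow_up

# Conversation mirroring the scenario in this issue:
history = ["How much did product X sell?"]
print(condense_query(history, "January"))
# -> "How much did product X sell? (January)"
```

The key design point is that the condensation happens *before* the retrieval step, which addresses question 3 above: instructions in the chat prompt cannot reach the retriever, but a rewritten query can.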
