
Support outputting the reasoning_content returned by deepseek-r1 #2283

Open
@stillmoon

Description



Expected Behavior

chatModel.stream(prompt)
    .doOnNext(response -> {
        System.out.println(response.getResult().getOutput().getContent());
        // Also output reasoning_content
        System.out.println(response.getResult().getOutput().getReasoningContent());
    })
    .subscribe();

Current Behavior

chatModel.stream(prompt)
    .doOnNext(response -> {
        // reasoning_content is not exposed; only content is available
        System.out.println(response.getResult().getOutput().getContent());
    })
    .subscribe();

Context

When running LLM Q&A through ChatModel, deepseek-r1 returns its thinking in a separate reasoning_content field of the response. Currently the output Message only exposes a content field, so the model's reasoning cannot be received. Please add a related field.
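A minimal sketch of what the requested field might look like. The class and constructor below are hypothetical, simplified stand-ins for illustration only, not the actual Spring AI message types; only the getReasoningContent() accessor name comes from the request above:

```java
// Hypothetical, simplified message type illustrating the requested field.
// Not the real Spring AI AssistantMessage.
class ReasoningAssistantMessage {
    private final String content;          // the final answer text
    private final String reasoningContent; // deepseek-r1's thinking output

    ReasoningAssistantMessage(String content, String reasoningContent) {
        this.content = content;
        this.reasoningContent = reasoningContent;
    }

    public String getContent() {
        return content;
    }

    // The requested accessor: would carry the reasoning_content field
    // from the model's response alongside the regular content.
    public String getReasoningContent() {
        return reasoningContent;
    }
}
```

With such a field populated from the API response, callers could print both the answer and the reasoning, as in the Expected Behavior snippet above.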
