Description
Expected Behavior
chatModel.stream(xxx)
        .flatMapSequential(f -> {
            // final answer text
            System.out.println(f.getResult().getOutput().getContent());
            // desired: also expose the model's reasoning_content
            System.out.println(f.getResult().getOutput().getReasoningContent());
            return Mono.just(f);
        })
        .subscribe();
Current Behavior
chatModel.stream(xxx)
        .flatMapSequential(f -> {
            // only the content field is available; reasoning_content is not exposed
            System.out.println(f.getResult().getOutput().getContent());
            return Mono.just(f);
        })
        .subscribe();
Context
When running LLM Q&A through ChatModel, deepseek-r1 returns its chain-of-thought in a separate reasoning_content field of the response, alongside the normal answer text. The Message output currently exposes only the content field, so the model's thinking cannot be received. Please add a corresponding field (and getter) so that reasoning_content is accessible.
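For illustration, here is a minimal sketch of how a message type could carry both fields. The class and method names below (ReasoningAwareAssistantMessage, getReasoningContent) are hypothetical and only show the shape of the requested addition; they are not an existing Spring AI API.

public class ReasoningAwareAssistantMessage {

    private final String content;          // final answer text
    private final String reasoningContent; // deepseek-r1 thinking text; null for models that do not return it

    public ReasoningAwareAssistantMessage(String content, String reasoningContent) {
        this.content = content;
        this.reasoningContent = reasoningContent;
    }

    public String getContent() {
        return content;
    }

    public String getReasoningContent() {
        return reasoningContent;
    }
}

With such a field populated from each streamed chunk, the snippet under Expected Behavior would work as written: the reasoning delta could be printed next to its content delta.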