ChatOllama won't use with_fallbacks when I use astream_events. #24816
Labels
enhancement
Enhancement of existing functionality
🤖:bug
Related to a bug, vulnerability, unexpected error with an existing feature
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
Description
ChatOllama won't use with_fallbacks when I use astream_events, but it does use the fallback when I use ainvoke.
My goal is to know which model produced a given output. When I chain a PydanticToolsParser after the model, I can no longer tell which model generated the result (that information is hidden in the AIMessage of the intermediate model output). So I wanted to pull the intermediate result out of astream_events to determine which model produced it.
That's when I found that ChatOllama does not seem to trigger with_fallbacks under astream_events. Is there a better solution? A minimal sketch of my setup is below.
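Here is a minimal sketch of roughly what I am doing (the `llama3` model name, the base URL, and the ChatOpenAI fallback are just illustrative stand-ins for my actual configuration):

```python
import asyncio

from langchain_ollama import ChatOllama
from langchain_openai import ChatOpenAI

# Primary model (local Ollama) with an OpenAI model as fallback.
primary = ChatOllama(model="llama3", base_url="http://localhost:11434")
fallback = ChatOpenAI(model="gpt-4o-mini")
model = primary.with_fallbacks([fallback])


async def main() -> None:
    # ainvoke correctly falls back to the OpenAI model when Ollama fails.
    result = await model.ainvoke("Hello")
    print(result)

    # astream_events does not seem to trigger the fallback; I was hoping to
    # inspect the intermediate events here to see which model produced the output.
    async for event in model.astream_events("Hello", version="v2"):
        print(event["event"], event.get("name"))


asyncio.run(main())
```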
System Info
langchain==0.2.11
langchain-core==0.2.24
langchain-ollama==0.1.0
langchain-openai==0.1.19
platform linux
python version = 3.10.12