ChatOllama won't use with_fallbacks when I use astream_events. #24816

Open
gbaian10 opened this issue Jul 30, 2024 · 1 comment
Labels: 03 enhancement (Enhancement of existing functionality), 🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature)

Comments

gbaian10 (Contributor) commented Jul 30, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import asyncio
from enum import Enum

from dotenv import load_dotenv
from langchain_core.output_parsers import PydanticToolsParser
from langchain_ollama import ChatOllama
from langchain_openai import ChatOpenAI
from pydantic.v1 import BaseModel, Field

load_dotenv()


class DateEnum(str, Enum):
    first_day = "2024-10-10 10:00:00"
    second_day = "2024-10-11 14:00:00"
    third_day = "2024-10-12 14:00:00"


class SelectItem(BaseModel):
    """Confirm the user's choice based on the user's answer."""

    item: DateEnum = Field(..., description="Select a date based on user responses")


tools = [SelectItem]
ollama_llm = ChatOllama(model="llama3.1:8b").bind_tools(tools)
openai_llm = ChatOpenAI(model="gpt-4o-mini").bind_tools(tools)
parser = PydanticToolsParser(tools=tools)

chain = ollama_llm | parser
fall_back_chain = openai_llm | parser
# Fall back to the OpenAI chain if the Ollama chain raises an error.
with_fallback_chain = chain.with_fallbacks([fall_back_chain])

messages = [
    ("ai", f"Which day is most convenient for you in {list(DateEnum)}?"),
    ("human", "30"),
]


async def main():
    async for event in with_fallback_chain.astream_events(messages, version="v2"):
        print(event)  # It will not call fall_back
    print("-" * 20)
    print(await with_fallback_chain.ainvoke(messages))  # It will call fall_back


asyncio.run(main())

Error Message and Stack Trace (if applicable)

[screenshots of the error output attached in the original issue; no text transcript]

Description

ChatOllama does not trigger the with_fallbacks chain when called through astream_events, but the fallback does fire when the same chain is called through ainvoke.

My goal is to determine which model produced a given output. When PydanticToolsParser is chained after the model, the parsed result no longer reveals which model generated it (that information is hidden in the AIMessage of the intermediate model output).

So I tried to pull the intermediate result out of astream_events to identify the producing model, and found that ChatOllama apparently never falls back under astream_events. Is there a better solution?
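
As a hedged sketch of what I am after (not part of the repro above): in the v2 astream_events schema, each event dict carries the class name of the emitting runnable in event["name"], so once the fallback actually fires, the producing model could be read off the stream. This reuses with_fallback_chain and messages from the example code:

# Sketch only: assumes the v2 event schema, where event["name"] is the class
# name of the runnable that emitted the event (e.g. "ChatOllama" or
# "ChatOpenAI"); reuses with_fallback_chain / messages from the repro above.
async def identify_model() -> None:
    async for event in with_fallback_chain.astream_events(messages, version="v2"):
        if event["event"] == "on_chat_model_end":
            print("model that answered:", event["name"])

This only helps if the fallback is triggered under astream_events in the first place, which is exactly what does not happen here.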

System Info

langchain==0.2.11
langchain-core==0.2.24
langchain-ollama==0.1.0
langchain-openai==0.1.19

platform linux
python version = 3.10.12

@dosubot dosubot bot added the 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature label Jul 30, 2024
@eyurtsev eyurtsev self-assigned this Jul 30, 2024
@baskaryan baskaryan added the 03 enhancement Enhancement of existing functionality label Aug 1, 2024
ji24077 commented Oct 13, 2024

Hi, I am Ji, a 4th-year CS student at UofT. I am working with @Ismail-Bashir, @Ser0n-ath, and @yashankxy, who are also 4th-year CS students at UofT. Can we take the initiative to work on this issue and contribute to LangChain?
Thanks 😊
