TypeError: langchain_core.language_models.chat_models.BaseChatModel.generate_prompt() got multiple values for keyword argument 'callbacks' #23379
Comments
Is there any workaround?
I have created my own parser and it is actually working. However, suppose I have a supervisor agent and two more agents: if I want them to talk to the supervisor, it gives me `An error occurred (ValidationException) when calling the Converse operation: A conversation must alternate between user and assistant roles. Make sure the conversation alternates between user and assistant roles and try again.`
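Since the Converse API rejects any conversation in which two consecutive messages share the same role, one common workaround is to merge adjacent same-role messages before sending them. The helper below is a minimal, hypothetical sketch (not part of LangChain) using plain role/content dicts to illustrate the idea:

```python
def merge_consecutive_roles(messages):
    """Merge adjacent messages that share the same role so the
    conversation alternates, as the Bedrock Converse API requires."""
    merged = []
    for msg in messages:
        if merged and merged[-1]["role"] == msg["role"]:
            # Fold this message's content into the previous same-role message
            merged[-1] = {
                "role": msg["role"],
                "content": merged[-1]["content"] + "\n" + msg["content"],
            }
        else:
            merged.append(msg)
    return merged

msgs = [
    {"role": "user", "content": "hi"},
    {"role": "assistant", "content": "hello"},
    {"role": "assistant", "content": "how can I help?"},
    {"role": "user", "content": "weather?"},
]
print(merge_consecutive_roles(msgs))
```

If you are using LangChain message objects instead of dicts, the same idea applies by comparing message types rather than `"role"` keys.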
Hi, is there any resolution for this issue?
Checked other resources
Example Code
```python
from langchain_aws import ChatBedrock
from langchain_experimental.llms.anthropic_functions import AnthropicFunctions
from dotenv import load_dotenv

load_dotenv()

# Initialize the LLM with the required parameters
llm = ChatBedrock(
    model_id="anthropic.claude-3-haiku-20240307-v1:0",
    model_kwargs={"temperature": 0.1},
    region_name="us-east-1"
)

# Initialize AnthropicFunctions with the LLM
base_model = AnthropicFunctions(llm=llm)

# Define the function parameters for the model
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "unit": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"],
                },
            },
            "required": ["location"],
        },
    }
]

# Bind the functions to the model without causing keyword conflicts
model = base_model.bind(
    functions=functions,
    function_call={"name": "get_current_weather"}
)

# Invoke the model with the provided input
res = model.invoke("What's the weather in San Francisco?")

# Extract and print the function call from the response
function_call = res.additional_kwargs.get("function_call")
print("function_call", function_call)
```
Error Message and Stack Trace (if applicable)
TypeError: langchain_core.language_models.chat_models.BaseChatModel.generate_prompt() got multiple values for keyword argument 'callbacks'
Description
I am trying to use function calling with Anthropic's models through Bedrock. Please help me fix this problem.
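One possible workaround, sketched below under the assumption that a recent `langchain-aws` release is installed (where `ChatBedrock` exposes native tool calling via `bind_tools`), is to skip the experimental `AnthropicFunctions` wrapper entirely and pass the tool schema to `ChatBedrock` directly. The Bedrock call itself is left commented out since it requires AWS credentials; only the schema is defined and checked here:

```python
# Tool schema reused from the reproduction above; this part is pure data.
weather_tool = {
    "name": "get_current_weather",
    "description": "Get the current weather in a given location",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The city and state, e.g. San Francisco, CA",
            },
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["location"],
    },
}

# With AWS credentials configured, the call might look like this
# (untested assumption: your langchain-aws version supports bind_tools):
#
# from langchain_aws import ChatBedrock
# llm = ChatBedrock(
#     model_id="anthropic.claude-3-haiku-20240307-v1:0",
#     model_kwargs={"temperature": 0.1},
#     region_name="us-east-1",
# )
# model = llm.bind_tools([weather_tool])
# res = model.invoke("What's the weather in San Francisco?")
# print(res.tool_calls)

print(sorted(weather_tool["parameters"]["properties"]))
```

This sidesteps the `generate_prompt()` keyword conflict because no experimental adapter re-passes `callbacks` on top of the base model's own arguments.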
System Info
I am using the latest version.