Initial Checks
- I confirm that I'm using the latest version of Pydantic AI
- I confirm that I searched for my issue in https://github.com/pydantic/pydantic-ai/issues before opening this issue
Description
I'm having issues with streaming responses for a research agent using the latest Gemini models. This works fine:
```python
result = await sdk.research_agent.run(question, deps=context)
print(result.data)
```
but this fails:
```python
async with sdk.research_agent.run_stream(question, deps=context) as stream:
    async for chunk in stream.stream():
        print(chunk)
```
This results in:
```
---------------------------------------------------------------------------
ModelHTTPError Traceback (most recent call last)
Cell In[7], line 1
----> 1 async with sdk.research_agent.run_stream(question, deps=context) as stream:
2 async for chunk in stream.stream():
3 print(chunk)
File /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/contextlib.py:210, in _AsyncGeneratorContextManager.__aenter__(self)
208 del self.args, self.kwds, self.func
209 try:
--> 210 return await anext(self.gen)
211 except StopAsyncIteration:
212 raise RuntimeError("generator didn't yield") from None
File ~/venvs/nxd/lib/python3.12/site-packages/pydantic_ai/agent.py:690, in Agent.run_stream(self, user_prompt, result_type, message_history, model, deps, model_settings, usage_limits, usage, infer_name)
688 if self.is_model_request_node(node):
689 graph_ctx = agent_run.ctx
--> 690 async with node._stream(graph_ctx) as streamed_response: # pyright: ignore[reportPrivateUsage]
692 async def stream_to_final(
693 s: models.StreamedResponse,
694 ) -> FinalResult[models.StreamedResponse] | None:
695 result_schema = graph_ctx.deps.result_schema
File /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/contextlib.py:210, in _AsyncGeneratorContextManager.__aenter__(self)
208 del self.args, self.kwds, self.func
209 try:
--> 210 return await anext(self.gen)
211 except StopAsyncIteration:
212 raise RuntimeError("generator didn't yield") from None
File ~/venvs/nxd/lib/python3.12/site-packages/pydantic_ai/_agent_graph.py:293, in ModelRequestNode._stream(self, ctx)
290 assert not self._did_stream, 'stream() should only be called once per node'
292 model_settings, model_request_parameters = await self._prepare_request(ctx)
--> 293 async with ctx.deps.model.request_stream(
294 ctx.state.message_history, model_settings, model_request_parameters
295 ) as streamed_response:
296 self._did_stream = True
297 ctx.state.usage.incr(_usage.Usage(), requests=1)
File /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/contextlib.py:210, in _AsyncGeneratorContextManager.__aenter__(self)
208 del self.args, self.kwds, self.func
209 try:
--> 210 return await anext(self.gen)
211 except StopAsyncIteration:
212 raise RuntimeError("generator didn't yield") from None
File ~/venvs/nxd/lib/python3.12/site-packages/pydantic_ai/models/gemini.py:150, in GeminiModel.request_stream(self, messages, model_settings, model_request_parameters)
142 @asynccontextmanager
143 async def request_stream(
144 self,
(...) 147 model_request_parameters: ModelRequestParameters,
148 ) -> AsyncIterator[StreamedResponse]:
149 check_allow_model_requests()
--> 150 async with self._make_request(
151 messages, True, cast(GeminiModelSettings, model_settings or {}), model_request_parameters
152 ) as http_response:
153 yield await self._process_streamed_response(http_response)
File /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/contextlib.py:210, in _AsyncGeneratorContextManager.__aenter__(self)
208 del self.args, self.kwds, self.func
209 try:
--> 210 return await anext(self.gen)
211 except StopAsyncIteration:
212 raise RuntimeError("generator didn't yield") from None
File ~/venvs/nxd/lib/python3.12/site-packages/pydantic_ai/models/gemini.py:242, in GeminiModel._make_request(self, messages, streamed, model_settings, model_request_parameters)
240 await r.aread()
241 if status_code >= 400:
--> 242 raise ModelHTTPError(status_code=status_code, model_name=self.model_name, body=r.text)
243 raise UnexpectedModelBehavior(f'Unexpected response from gemini {status_code}', r.text)
244 yield r
ModelHTTPError: status_code: 400, model_name: gemini-1.5-pro, body: [{
"error": {
"code": 400,
"message": "Invalid JSON payload received. Unknown name "default" at 'tools.function_declarations[0].parameters.properties[0].value': Cannot find field.\nInvalid JSON payload received. Unknown name "default" at 'tools.function_declarations[0].parameters.properties[1].value': Cannot find field.\nInvalid JSON payload received. Unknown name "default" at 'tools.function_declarations[0].parameters.properties[2].value': Cannot find field.\nInvalid JSON payload received. Unknown name "default" at 'tools.function_declarations[0].parameters.properties[3].value': Cannot find field.\nInvalid JSON payload received. Unknown name "additionalProperties" at 'tools.function_declarations[1].parameters': Cannot find field.",
"status": "INVALID_ARGUMENT",
"details": [
{
"@type": "type.googleapis.com/google.rpc.BadRequest",
"fieldViolations": [
{
"field": "tools.function_declarations[0].parameters.properties[0].value",
"description": "Invalid JSON payload received. Unknown name "default" at 'tools.function_declarations[0].parameters.properties[0].value': Cannot find field."
},
{
"field": "tools.function_declarations[0].parameters.properties[1].value",
"description": "Invalid JSON payload received. Unknown name "default" at 'tools.function_declarations[0].parameters.properties[1].value': Cannot find field."
},
{
"field": "tools.function_declarations[0].parameters.properties[2].value",
"description": "Invalid JSON payload received. Unknown name "default" at 'tools.function_declarations[0].parameters.properties[2].value': Cannot find field."
},
{
"field": "tools.function_declarations[0].parameters.properties[3].value",
"description": "Invalid JSON payload received. Unknown name "default" at 'tools.function_declarations[0].parameters.properties[3].value': Cannot find field."
},
{
"field": "tools.function_declarations[1].parameters",
"description": "Invalid JSON payload received. Unknown name "additionalProperties" at 'tools.function_declarations[1].parameters': Cannot find field."
}
]
}
]
}
}
]
```
The error persists even if I use `Agent.iter` for streaming and execute each node manually. Surprisingly, this works fine with OpenAI models. It appears the tool function definitions are not being passed to Gemini in a format it accepts.
For reference, I'm passing the tools to the agent like this:
```python
Agent(
    model_obj=get_model('HP_LC'),
    result_type=str,
    deps_type=ChatContext,
    tools=[Tool(research_tool, takes_ctx=True), Tool(references_tool, takes_ctx=True)],
)
```
For now I have to stick to the non-streaming mode of operation, but that results in a poor user experience.
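As a stopgap, here is a minimal sketch of a client-side workaround, assuming the failure is caused only by the extra `default` and `additionalProperties` keys in the generated tool schemas (`sanitize_for_gemini` is a hypothetical helper for illustration, not a Pydantic AI API):

```python
def sanitize_for_gemini(schema: dict) -> dict:
    """Recursively drop JSON-schema keys that the Gemini API rejects
    (the 400 above names 'default' and 'additionalProperties')."""
    unsupported = {'default', 'additionalProperties'}
    cleaned = {}
    for key, value in schema.items():
        if key in unsupported:
            continue
        if isinstance(value, dict):
            cleaned[key] = sanitize_for_gemini(value)
        elif isinstance(value, list):
            cleaned[key] = [
                sanitize_for_gemini(item) if isinstance(item, dict) else item
                for item in value
            ]
        else:
            cleaned[key] = value
    return cleaned


schema = {
    'type': 'object',
    'additionalProperties': False,
    'properties': {'how_many_times': {'type': 'integer', 'default': 1}},
}
print(sanitize_for_gemini(schema))
# {'type': 'object', 'properties': {'how_many_times': {'type': 'integer'}}}
```

In principle something like this could be applied to each function declaration before the request is built, but it only hides the symptom; the real fix belongs in the Gemini model's schema translation.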
Example Code
I just changed one of the samples from the documentation by adding a function parameter to the tool. It fails immediately. Without any function parameters on the tool, it works fine.
```python
import random

from pydantic_ai import Agent, RunContext, Tool


def roll_die(how_many_times: int = 1) -> str:
    """Roll a six-sided die how_many_times and return the average result."""
    rolls = [random.randint(1, 6) for _ in range(how_many_times)]
    return str(sum(rolls) / len(rolls))


def get_player_name(ctx: RunContext[str]) -> str:
    """Get the player's name."""
    return ctx.deps


agent = Agent(
    'google-gla:gemini-1.5-pro',
    deps_type=str,
    system_prompt=(
        "You're a dice game, you should roll the die and see if the number "
        "you get back matches the user's guess. If so, tell them they're a winner. "
        "Use the player's name in the response."
    ),
    tools=[Tool(roll_die, takes_ctx=False), Tool(get_player_name, takes_ctx=True)],
)

async with agent.run_stream('My guess is 4', deps='Anne') as stream:
    async for chunk in stream.stream():
        print(chunk)
```
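To illustrate where the rejected key likely comes from: a parameter with a default value, like `how_many_times: int = 1`, picks up a `"default"` entry in its derived parameter schema. Here is a rough sketch (`tool_schema` is a hypothetical, simplified helper for illustration; the schema Pydantic AI actually builds is richer):

```python
import inspect


def tool_schema(func) -> dict:
    """Build a simplified parameter schema from a function signature,
    roughly mirroring a signature-to-JSON-schema step."""
    type_names = {int: 'integer', str: 'string', float: 'number', bool: 'boolean'}
    props = {}
    for name, param in inspect.signature(func).parameters.items():
        prop = {'type': type_names.get(param.annotation, 'string')}
        if param.default is not inspect.Parameter.empty:
            # Defaulted parameters pick up a 'default' key -- the exact
            # field the Gemini 400 above complains about.
            prop['default'] = param.default
        props[name] = prop
    return {'type': 'object', 'properties': props}


def roll_die(how_many_times: int = 1) -> str:
    ...


print(tool_schema(roll_die))
# {'type': 'object', 'properties': {'how_many_times': {'type': 'integer', 'default': 1}}}
```

That would explain why the sample only fails once a parameter is added: a tool with no parameters produces no properties, so there is nothing for Gemini to reject.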
Python, Pydantic AI & LLM client version
```
Python 3.12.6
pydantic-ai-slim==0.0.55
google-genai==1.10.0
```