Hi,
I noticed that when using the agent to call an OpenAI-compatible model service, the HTTP request body always includes "stream": false.
I’d like to ask whether this parameter can be configured or overridden. Currently, the agent doesn’t seem to provide an option to enable streaming responses (i.e., "stream": true).
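For context, the only difference in the request body is the `stream` flag. A minimal sketch of what I mean (model name and message are placeholders; this just builds the payloads, it doesn't send them):

```python
import json

# Payload the agent currently sends (non-streaming)
payload = {
    "model": "gpt-4o",  # placeholder model name
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": False,
}

# Desired behavior: same payload with streaming enabled,
# so the server returns incremental server-sent-event chunks
streaming_payload = {**payload, "stream": True}

print(json.dumps(streaming_payload, indent=2))
```

Being able to set this flag (e.g., via the agent's model/provider configuration) is what I'm asking about.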