Closed
Labels
bug (Bug related to the Logfire Python SDK)
Description
The Logfire UI nicely shows the tool call made by an LLM for non-streamed responses, but for streamed responses the Assistant box is empty.
Code to reproduce
```python
# Test logfire streamed response
from openai import Client

import logfire

logfire.configure()
logfire.instrument_openai()

client = Client()
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Create a Superhero named Monkey Boy."}],
    stream=True,
    stream_options={"include_usage": True},
    tool_choice={"type": "function", "function": {"name": "return_superhero"}},
    tools=[
        {
            "type": "function",
            "function": {
                "name": "return_superhero",
                "parameters": {
                    "properties": {
                        "name": {"title": "Name", "type": "string"},
                        "age": {"title": "Age", "type": "integer"},
                        "power": {"title": "Power", "type": "string"},
                        "enemies": {
                            "items": {"type": "string"},
                            "title": "Enemies",
                            "type": "array",
                        },
                    },
                    "required": ["name", "age", "power", "enemies"],
                    "type": "object",
                },
            },
        },
    ],
)
for chunk in response:
    print(chunk)
```

Related (closed) issue: #54
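For context, in a streamed response the assistant delta carries no `content`; the tool-call name and its JSON arguments arrive as incremental string fragments on `chunk.choices[0].delta.tool_calls[*].function.arguments`, which an instrumentation has to reassemble before it can display anything in the Assistant box. A minimal sketch of that reassembly (the chunk objects here are simulated with `SimpleNamespace`, not real OpenAI SDK types, and the sample fragments are made up):

```python
from types import SimpleNamespace as NS


def accumulate_tool_calls(chunks):
    """Reassemble streamed tool calls from chat completion chunk deltas.

    Tool-call fragments are keyed by the delta's `index`; the function
    name usually appears once, while `arguments` accumulates as string
    fragments across many chunks.
    """
    calls = {}  # index -> {"name": str, "arguments": str}
    for chunk in chunks:
        if not chunk.choices:
            # e.g. the final usage-only chunk when include_usage is set
            continue
        delta = chunk.choices[0].delta
        for tc in delta.tool_calls or []:
            call = calls.setdefault(tc.index, {"name": "", "arguments": ""})
            if tc.function.name:
                call["name"] = tc.function.name
            if tc.function.arguments:
                call["arguments"] += tc.function.arguments
    return calls


# Simulated chunks mirroring the shape of an OpenAI streamed response:
chunks = [
    NS(choices=[NS(delta=NS(tool_calls=[
        NS(index=0, function=NS(name="return_superhero", arguments=""))]))]),
    NS(choices=[NS(delta=NS(tool_calls=[
        NS(index=0, function=NS(name=None, arguments='{"name": "Monkey Boy"'))]))]),
    NS(choices=[NS(delta=NS(tool_calls=[
        NS(index=0, function=NS(name=None, arguments=', "age": 12}'))]))]),
    NS(choices=[]),  # usage-only chunk has an empty choices list
]
print(accumulate_tool_calls(chunks))
```

Until the fragments are joined like this, the assistant message for a streamed tool call has nothing directly renderable, which may be why the box shows up empty.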
Python, Logfire & OS Versions, related packages (not required)
logfire="0.50.1"
platform="macOS-15.0.1-arm64-arm-64bit"
python="3.10.12 (main, Jul 15 2023, 09:54:16) [Clang 14.0.3
(clang-1403.0.22.14.1)]"
[related_packages]
requests="2.32.3"
pydantic="2.8.2"
openai="1.52.0"
protobuf="4.25.3"
rich="13.7.1"
tomli="2.0.1"
executing="2.0.1"
opentelemetry-api="1.25.0"
opentelemetry-exporter-otlp-proto-common="1.25.0"
opentelemetry-exporter-otlp-proto-http="1.25.0"
opentelemetry-instrumentation="0.46b0"
opentelemetry-proto="1.25.0"
opentelemetry-sdk="1.25.0"
opentelemetry-semantic-conventions="0.46b0"