[bug] Langchain OpenInference Trace output.value not a json dump of a BaseMessage #1401

Open
@njbrake

Description

Hi! Please let me know if I'm missing something silly about how I should be reloading the LangChain message from the OpenInference trace; it's possible that I'm misunderstanding the intended use of the trace.

Describe the bug
If I export a LangChain trace and look at the output value from the final span with "openinference.span.kind": "AGENT", I can see that output.value is a serialized list of LangChain messages. However, the messages are not serialized into anything that I can re-load into a LangChain BaseMessage:

The output in the openinference trace looks like:

      "output.value": "{\"messages\": [\"content='My output content here.' additional_kwargs={'refusal': None} response_metadata={'token_usage': {'completion_tokens': 1040, 'prompt_tokens': 2190, 'total_tokens': 3230, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 896, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 1920}}, 'model_name': 'o3-mini-2025-01-31', 'system_fingerprint': 'fp_42bfad963b', 'id': 'chatcmpl-BDE6vvr0h4Z0xOF5bMaEiji8yiREa', 'finish_reason': 'stop', 'logprobs': None} id='run-7ab1854f-7bd7-40a2-9cc0-3c79d01d6d57-0' usage_metadata={'input_tokens': 2190, 'output_tokens': 1040, 'total_tokens': 3230, 'input_token_details': {'audio': 0, 'cache_read': 1920}, 'output_token_details': {'audio': 0, 'reasoning': 896}}\"]}",

With that object, I can successfully do output_value = json.loads(span['attributes']['output.value']) and then access the first message via output_value['messages'][0]. But after that, I can neither json.loads the message (because it's a repr-style string with '=' separating key/value pairs, not JSON) nor do BaseMessage(**output_value['messages'][0]), because LangChain expects a dict, not a string.
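The failure can be reproduced with the standard library alone. The payload below is an abbreviated, hypothetical stand-in for the output.value string shown above:

```python
import json

# Abbreviated stand-in for span['attributes']['output.value'] from the trace:
# the outer layer is valid JSON, but each message inside is a repr-style string.
output_value_raw = (
    '{"messages": ["content=\'My output content here.\' '
    "additional_kwargs={'refusal': None} id='run-abc-0'\"]}"
)

# The outer layer parses fine...
output_value = json.loads(output_value_raw)
first_message = output_value["messages"][0]
print(type(first_message).__name__)  # str, not dict

# ...but the message itself is not JSON, so a second json.loads fails.
try:
    json.loads(first_message)
except json.JSONDecodeError as exc:
    print(f"inner parse failed: {exc.msg}")
```

Since the inner string is neither JSON nor a dict, there is no obvious way to hand it to a BaseMessage constructor.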

To Reproduce
Create a trace with a langchain agent, export it locally, and look at the last span that has "openinference.span.kind": "AGENT".

Expected behavior
I would expect that if openinference is serializing a langchain message, it would provide some mechanism to load and recover the langchain message object.
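For comparison, if each message were serialized as a JSON dict of fields rather than a repr string, the round trip would work. The field names below are a hypothetical subset, not the actual OpenInference schema:

```python
import json

# Hypothetical dict-based serialization of one message (field names assumed).
serialized = {
    "messages": [
        {
            "type": "ai",
            "content": "My output content here.",
            "additional_kwargs": {"refusal": None},
            "id": "run-abc-0",
        }
    ]
}

# A JSON dump of dicts survives a full round trip...
restored = json.loads(json.dumps(serialized))
first = restored["messages"][0]
print(first["content"])

# ...and a dict could then be splatted into a message constructor, e.g.
# AIMessage(**{k: v for k, v in first.items() if k != "type"})
# (constructor name and accepted fields are assumptions here, not the
# confirmed LangChain API for this payload).
```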


Desktop (please complete the following information):

  • macOS


Labels

  • bug (Something isn't working)
  • language: python (Related to Python integration)
  • needs attention (The issue or PR requires a person to take a look and respond)
