chore(llmobs/langchain): stop using in-process vcr #13935


Open · Kyle-Verhoog wants to merge 1 commit into main from kylev/langchain-death-flaky
Conversation

Kyle-Verhoog (Member)

I suspect the in-process VCR is causing flaky tests because:

  • VCR patches all HTTP connections, possibly including those to the test agent; and
  • I don't trust in-process VCR monkeypatching very much.

Instead, we can continue to use VCR, but out of process, by proxying through the test agent, which now supports a VCR feature.
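
For context, here is a minimal sketch (not the actual test code in this PR) contrasting the two approaches. The cassette path comes from the file list below; the proxy URL, port, and `/vcr/openai` route are assumptions about the test agent's VCR feature, used purely for illustration.

```python
import os

import openai
import vcr

# Before: vcrpy records/replays by monkeypatching HTTP libraries inside the
# test process, which can also intercept requests meant for the test agent.
with vcr.use_cassette("tests/llmobs/llmobs_cassettes/openai/openai_completions_post_7b116e69.yaml"):
    openai.OpenAI(api_key="dummy").completions.create(
        model="gpt-3.5-turbo-instruct", prompt="hello"
    )

# After (sketch): no in-process patching at all. The OpenAI client is pointed
# at the test agent, which proxies the request to the provider and handles
# record/replay out of process. The URL below (port 9126, /vcr/openai) is an
# assumption for illustration, not necessarily the test agent's real route.
client = openai.OpenAI(
    api_key="dummy",  # replayed requests never reach api.openai.com
    base_url=os.environ.get("OPENAI_BASE_URL", "http://localhost:9126/vcr/openai"),
)
client.completions.create(model="gpt-3.5-turbo-instruct", prompt="hello")
```

Because the provider traffic terminates at the test agent, the test process's HTTP stack stays unpatched and submissions to the test agent itself can no longer be swallowed by a cassette.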

Checklist

  • PR author has checked that all the criteria below are met
  • The PR description includes an overview of the change
  • The PR description articulates the motivation for the change
  • The change includes tests OR the PR description describes a testing strategy
  • The PR description notes risks associated with the change, if any
  • Newly-added code is easy to change
  • The change follows the library release note guidelines
  • The change includes or references documentation updates if necessary
  • Backport labels are set (if applicable)

Reviewer Checklist

  • Reviewer has checked that all the criteria below are met
  • Title is accurate
  • All changes are related to the pull request's stated goal
  • Avoids breaking API changes
  • Testing strategy adequately addresses listed risks
  • Newly-added code is easy to change
  • Release note makes sense to a user of the library
  • If necessary, author has acknowledged and discussed the performance implications of this PR as reported in the benchmarks PR comment
  • Backport labels are set in a manner that is consistent with the release branch maintenance policy

Kyle-Verhoog added the changelog/no-changelog label (a changelog entry is not required for this PR) on Jul 10, 2025
github-actions bot (Contributor) commented on Jul 10, 2025

CODEOWNERS have been resolved as:

tests/llmobs/llmobs_cassettes/openai/openai_chat_completions_post_e16811d2.yaml  @DataDog/ml-observability
tests/llmobs/llmobs_cassettes/openai/openai_completions_post_7b116e69.yaml  @DataDog/ml-observability
tests/llmobs/llmobs_cassettes/openai/openai_completions_post_8b80b4c0.yaml  @DataDog/ml-observability
tests/llmobs/llmobs_cassettes/openai/openai_completions_post_a23a180e.yaml  @DataDog/ml-observability
tests/llmobs/llmobs_cassettes/openai/openai_completions_post_b2bf069e.yaml  @DataDog/ml-observability
tests/contrib/langchain/conftest.py                                     @DataDog/ml-observability
tests/contrib/langchain/test_langchain.py                               @DataDog/ml-observability
tests/contrib/langchain/test_langchain_llmobs.py                        @DataDog/ml-observability

Kyle-Verhoog force-pushed the kylev/langchain-death-flaky branch from 1945240 to 07862ad on July 10, 2025 at 04:23
Kyle-Verhoog marked this pull request as ready for review on July 10, 2025 at 04:24
Kyle-Verhoog requested a review from a team as a code owner on July 10, 2025 at 04:24
github-actions bot (Contributor) commented on Jul 10, 2025

Bootstrap import analysis

Comparison of import times between this PR and base.

Summary

The average import time from this PR is: 286 ± 6 ms.

The average import time from base is: 286 ± 6 ms.

The import time difference between this PR and base is: -0.8 ± 0.3 ms.

Import time breakdown

The following import paths have shrunk:

ddtrace.auto 1.883 ms (0.66%)
ddtrace.bootstrap.sitecustomize 1.211 ms (0.42%)
ddtrace.bootstrap.preload 1.211 ms (0.42%)
ddtrace.internal.remoteconfig.client 0.639 ms (0.22%)
ddtrace 0.672 ms (0.24%)
ddtrace.internal._unpatched 0.028 ms (0.01%)
json 0.028 ms (0.01%)
json.decoder 0.028 ms (0.01%)
re 0.028 ms (0.01%)
enum 0.028 ms (0.01%)
types 0.028 ms (0.01%)

pr-commenter bot commented on Jul 10, 2025

Benchmarks

Benchmark execution time: 2025-07-10 14:27:38

Comparing candidate commit a5a7065 in PR branch kylev/langchain-death-flaky with baseline commit b17d958 in branch main.

Found 0 performance improvements and 2 performance regressions! Performance is the same for 516 metrics, 2 unstable metrics.

scenario:iastaspects-lstrip_aspect

  • 🟥 execution_time [+748.612ns; +835.323ns] or [+7.148%; +7.976%]

scenario:iastaspectsospath-ospathsplitext_aspect

  • 🟥 execution_time [+668.518ns; +780.800ns] or [+14.584%; +17.034%]

Kyle-Verhoog force-pushed the kylev/langchain-death-flaky branch from 07862ad to a5a7065 on July 10, 2025 at 13:32