
Conversation

jordan-wong

What does this PR do?

Motivation

Plugin Checklist

Additional Notes


github-actions bot commented Oct 2, 2025

Overall package size

Self size: 12.52 MB
Deduped: 112.74 MB
No deduping: 113.13 MB

Dependency sizes

| name | version | self size | total size |
|------|---------|-----------|------------|
| @datadog/libdatadog | 0.7.0 | 35.02 MB | 35.02 MB |
| @datadog/native-appsec | 10.2.1 | 20.64 MB | 20.65 MB |
| @datadog/native-iast-taint-tracking | 4.0.0 | 11.72 MB | 11.73 MB |
| @datadog/pprof | 5.10.0 | 9.91 MB | 10.3 MB |
| @opentelemetry/core | 1.30.1 | 908.66 kB | 7.16 MB |
| protobufjs | 7.5.4 | 2.95 MB | 5.73 MB |
| @datadog/wasm-js-rewriter | 4.0.1 | 2.85 MB | 3.58 MB |
| @datadog/native-metrics | 3.1.1 | 1.02 MB | 1.43 MB |
| @opentelemetry/api | 1.9.0 | 1.22 MB | 1.22 MB |
| jsonpath-plus | 10.3.0 | 617.18 kB | 1.08 MB |
| import-in-the-middle | 1.14.4 | 123.18 kB | 851.76 kB |
| lru-cache | 10.4.3 | 804.3 kB | 804.3 kB |
| opentracing | 0.14.7 | 194.81 kB | 194.81 kB |
| source-map | 0.7.6 | 185.63 kB | 185.63 kB |
| pprof-format | 2.2.1 | 163.06 kB | 163.06 kB |
| @datadog/sketches-js | 2.1.1 | 109.9 kB | 109.9 kB |
| lodash.sortby | 4.7.0 | 75.76 kB | 75.76 kB |
| ignore | 7.0.5 | 63.38 kB | 63.38 kB |
| istanbul-lib-coverage | 3.2.2 | 34.37 kB | 34.37 kB |
| rfdc | 1.4.1 | 27.15 kB | 27.15 kB |
| dc-polyfill | 0.1.10 | 26.73 kB | 26.73 kB |
| @isaacs/ttlcache | 1.4.1 | 25.2 kB | 25.2 kB |
| tlhunter-sorted-set | 0.1.0 | 24.94 kB | 24.94 kB |
| shell-quote | 1.8.3 | 23.74 kB | 23.74 kB |
| limiter | 1.1.5 | 23.17 kB | 23.17 kB |
| retry | 0.13.1 | 18.85 kB | 18.85 kB |
| semifies | 1.0.0 | 15.84 kB | 15.84 kB |
| jest-docblock | 29.7.0 | 8.99 kB | 12.76 kB |
| crypto-randomuuid | 1.0.0 | 11.18 kB | 11.18 kB |
| ttl-set | 1.0.0 | 4.61 kB | 9.69 kB |
| mutexify | 1.4.0 | 5.71 kB | 8.74 kB |
| path-to-regexp | 0.1.12 | 6.6 kB | 6.6 kB |
| module-details-from-path | 1.0.4 | 3.96 kB | 3.96 kB |

🤖 This report was automatically generated by heaviest-objects-in-the-universe


codecov bot commented Oct 2, 2025

Codecov Report

❌ Patch coverage is 20.00000% with 16 lines in your changes missing coverage. Please review.
✅ Project coverage is 82.88%. Comparing base (5875fec) to head (db5778f).
⚠️ Report is 224 commits behind head on master.

| Files with missing lines | Patch % | Lines |
|--------------------------|---------|-------|
| packages/dd-trace/src/llmobs/plugins/openai.js | 20.00% | 16 Missing ⚠️ |
Additional details and impacted files
```diff
@@            Coverage Diff             @@
##           master    #6583      +/-   ##
==========================================
- Coverage   83.17%   82.88%   -0.30%
==========================================
  Files         477      484       +7
  Lines       19739    20361     +622
==========================================
+ Hits        16418    16876     +458
- Misses       3321     3485     +164
```

☔ View full report in Codecov by Sentry.


pr-commenter bot commented Oct 2, 2025

Benchmarks

Benchmark execution time: 2025-10-10 21:11:02

Comparing candidate commit 0579f5e in PR branch openai-responses-instrumentation with baseline commit 5875fec in branch master.

Found 0 performance improvements and 0 performance regressions! Performance is the same for 1275 metrics, 48 unstable metrics.

Comment on lines 25 to 31

```js
{
  file: 'resources/responses',
  targetClass: 'Responses',
  baseResource: 'responses',
  methods: ['create'],
  streamedResponse: false
},
```
Collaborator

Suggested change

```js
{
  file: 'resources/responses',
  targetClass: 'Responses',
  baseResource: 'responses',
  methods: ['create'],
  streamedResponse: false
},
```

```js
baseResource: 'responses',
methods: ['create'],
streamedResponse: false,
versions: ['>=4.85.0']
```
Collaborator

Suggested change

```diff
- versions: ['>=4.85.0']
+ versions: ['>=4.87.0']
```

```js
targetClass: 'Responses',
baseResource: 'responses',
methods: ['create'],
streamedResponse: false,
```
Collaborator

Suggested change

```diff
- streamedResponse: false,
+ streamedResponse: true,
```

Comment on lines 156 to 172

```js
function wrapCreate (create) {
  return function (request) {
    if (!vertexaiTracingChannel.start.hasSubscribers) {
      // calls the original function
      return create.apply(this, arguments)
    }

    const ctx = {
      request,
      instance: this,
      resource: [this.constructor.name, create.name].join('.')
    }
    // am I using the right channel? tracingChannel vs diagnostics channel
    return ch.tracePromise(create, ctx, this, ...arguments)
  }
}
```

Collaborator

Suggested change

```js
function wrapCreate (create) {
  return function (request) {
    if (!vertexaiTracingChannel.start.hasSubscribers) {
      // calls the original function
      return create.apply(this, arguments)
    }
    const ctx = {
      request,
      instance: this,
      resource: [this.constructor.name, create.name].join('.')
    }
    // am I using the right channel? tracingChannel vs diagnostics channel
    return ch.tracePromise(create, ctx, this, ...arguments)
  }
}
```

Comment on lines 196 to 205

```js
//register patching hooks via addHook
addHook({ name: 'openai', file: 'resources/responses.js', versions: ['>=4.87.0'] }, exports => {
  const Responses = exports.OpenAIApi.responses
  // wrap functions on module exports with shimmer.wrap
  shimmer.wrap(responses.prototype, 'responses.createResponse', wrapCreate)
  return exports
})
```



Collaborator

Suggested change

```js
//register patching hooks via addHook
addHook({ name: 'openai', file: 'resources/responses.js', versions: ['>=4.87.0'] }, exports => {
  const Responses = exports.OpenAIApi.responses
  // wrap functions on module exports with shimmer.wrap
  shimmer.wrap(responses.prototype, 'responses.createResponse', wrapCreate)
  return exports
})
```

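As quoted, the hook assigns `Responses` but then wraps a lowercase `responses` binding with a dotted method path, so the patch never lands on the class. A hand-rolled sketch of the intended prototype patch — the `Responses` stand-in class and the local `wrap` helper are illustrative substitutes for the real SDK export and for shimmer, which wraps a prototype method by its bare name:

```javascript
'use strict'

// Stand-in for the class exported from openai's resources/responses.js.
class Responses {
  async create (request) {
    return { id: 'resp_1', echo: request }
  }
}

// Minimal equivalent of shimmer.wrap: replace a method with a wrapper
// that keeps a reference to the original.
function wrap (target, name, wrapper) {
  const original = target[name]
  target[name] = wrapper(original)
}

const calls = []
function wrapCreate (create) {
  return function (request) {
    calls.push(request) // instrumentation side effect, recorded per call
    return create.apply(this, arguments)
  }
}

// What the hook body presumably intended: patch the method by its bare
// name on the class prototype, then hand the exports back unchanged.
wrap(Responses.prototype, 'create', wrapCreate)

new Responses().create({ model: 'test' }).then(r => {
  console.log(r.id, calls.length)
})
```

The wrapper fires synchronously on invocation, so the instrumentation record exists before the returned promise resolves.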
Collaborator

we'll need a branch here to handle the streamed responses result, but we should be able to grab the last chunk like this

```js
const lastChunk = chunks[chunks.length - 1].response
```

and then grab usage and output off of that, which should all be combined
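A sketch of the streamed branch described above, assuming (as the comment does) that each buffered chunk carries a `.response` snapshot and the final chunk holds the combined output and usage. The chunk shape here is illustrative, not the SDK's exact streaming event type:

```javascript
'use strict'

// Pull the combined result off the final buffered chunk of a streamed
// Responses call, per the review comment's suggestion.
function extractStreamedResult (chunks) {
  if (!Array.isArray(chunks) || chunks.length === 0) return {}
  const lastChunk = chunks[chunks.length - 1].response
  return {
    output: lastChunk.output, // combined output accumulated by the stream
    usage: lastChunk.usage    // token counts, only present on the last chunk
  }
}

// Illustrative chunk buffer: intermediate snapshots lack usage data.
const chunks = [
  { response: { output: [{ type: 'message', content: 'He' }] } },
  { response: { output: [{ type: 'message', content: 'Hello' }], usage: { input_tokens: 3, output_tokens: 2 } } }
]

console.log(extractStreamedResult(chunks).usage.output_tokens)
```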

```js
if (reasoning) {
  metadata.reasoning = reasoning
}
if (background !== undefined) {
```
Collaborator

Suggested change

```diff
- if (background !== undefined) {
+ if (background) {
```

