Fix BYOK request logging: preserve context across IPC and remove duplicates #3474

Open
zhichli wants to merge 1 commit into main from zhichli/fixExport

Conversation


@zhichli zhichli commented Feb 5, 2026

Problem

BYOK model requests (Anthropic, Gemini, CustomOAI/Azure via CopilotLanguageModelWrapper) were not being logged correctly in the request log tree:

  1. Context loss: When requests cross the VS Code IPC boundary via vscode.lm.sendRequest(), the AsyncLocalStorage context is lost. This caused BYOK requests to appear as top-level items instead of children of their prompt items.

     Screenshot from 2026-02-04 21-54-11

  2. Duplicate logging: ExtensionContributedChatEndpoint was logging requests with 0 token usage, while the BYOK providers logged the same requests with correct token counts, creating duplicates.

  3. Incorrect export: The log export logic relied on fragile workarounds to re-associate orphaned requests.

Solution

Correlation ID Mechanism

  • Added capturingTokenCorrelationMap in requestLogger.ts to store CapturingToken with a correlation ID before IPC
  • Pass correlation ID through modelOptions._capturingTokenCorrelationId
  • BYOK providers retrieve and restore the context after IPC using runWithCapturingToken()

Changes

requestLogger.ts

  • Added storeCapturingTokenForCorrelation() - stores token before IPC
  • Added retrieveCapturingTokenByCorrelation() - retrieves token after IPC
  • Added runWithCapturingToken() - executes code within restored context
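
A hypothetical sketch of the three helpers, assuming an AsyncLocalStorage-backed logging context and a Map-backed correlation store. The function names come from the PR description; the CapturingToken shape, signatures, and internals are assumptions:

```typescript
import { AsyncLocalStorage } from 'node:async_hooks';

interface CapturingToken { readonly requestId: string }

// Assumed: the logger tracks the current prompt item via AsyncLocalStorage.
const tokenStorage = new AsyncLocalStorage<CapturingToken>();
const capturingTokenCorrelationMap = new Map<string, CapturingToken>();

// Store the current token under a correlation ID before the request
// crosses the IPC boundary (where the AsyncLocalStorage context is lost).
function storeCapturingTokenForCorrelation(correlationId: string): void {
    const token = tokenStorage.getStore();
    if (token) {
        capturingTokenCorrelationMap.set(correlationId, token);
    }
}

// Retrieve (and remove) the token on the provider side after IPC, so the
// map does not accumulate consumed entries.
function retrieveCapturingTokenByCorrelation(correlationId: string): CapturingToken | undefined {
    const token = capturingTokenCorrelationMap.get(correlationId);
    capturingTokenCorrelationMap.delete(correlationId);
    return token;
}

// Execute a callback with the restored token as the active logging context.
function runWithCapturingToken<T>(token: CapturingToken, fn: () => T): T {
    return tokenStorage.run(token, fn);
}
```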

extChatEndpoint.ts

  • Store correlation ID before sendRequest() IPC call
  • Removed duplicate logging (BYOK providers log their own requests with correct token usage)
  • Removed unused _requestLogger and _endpointProvider dependencies

anthropicProvider.ts / geminiNativeProvider.ts / languageModelAccess.ts

  • Retrieve correlation ID from modelOptions
  • Wrap request execution in restored CapturingToken context
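
End to end, the handoff between the endpoint and a provider might look roughly like the sketch below. beforeIpc, handleProviderRequest, and the in-process "IPC boundary" stub are illustrative names under assumed signatures, not the actual implementation:

```typescript
import { AsyncLocalStorage } from 'node:async_hooks';

interface CapturingToken { readonly requestId: string }

const contextStorage = new AsyncLocalStorage<CapturingToken>();
const correlationMap = new Map<string, CapturingToken>();

// Endpoint side (extChatEndpoint.ts): still inside the original context,
// stash the token and thread the correlation ID through modelOptions.
function beforeIpc(ourRequestId: string): { _capturingTokenCorrelationId: string } {
    const token = contextStorage.getStore();
    if (token) {
        correlationMap.set(ourRequestId, token);
    }
    return { _capturingTokenCorrelationId: ourRequestId };
}

// Provider side (e.g. anthropicProvider.ts): the AsyncLocalStorage context
// was lost crossing IPC, so restore it from the correlation map.
function handleProviderRequest<T>(
    modelOptions: { _capturingTokenCorrelationId?: string },
    doRequest: () => T
): T {
    const id = modelOptions._capturingTokenCorrelationId;
    const token = id ? correlationMap.get(id) : undefined;
    if (id) { correlationMap.delete(id); }
    // Requests logged inside doRequest() now attach to the original prompt item.
    return token ? contextStorage.run(token, doRequest) : doRequest();
}
```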

requestLogTree.ts

  • Simplified export logic by removing lastChatRequestItem fallback workaround

Result

  • BYOK requests now appear as children of their prompt items ✅
  • Correct token usage displayed (no more 0 values) ✅
  • No duplicate log entries ✅
  • Cleaner export logic ✅

Verified Examples

  • CAPI
Screenshot from 2026-02-05 00-02-01
  • Anthropic
Screenshot from 2026-02-05 00-02-11
  • Gemini
Screenshot from 2026-02-05 00-02-20
  • Azure/CustomOAI
Screenshot from 2026-02-05 00-08-20

- Add correlation ID mechanism to restore AsyncLocalStorage context after IPC
- BYOK providers (AnthropicBYOK, GeminiNativeBYOK, CopilotLanguageModelWrapper)
  now log requests as children of their prompt items with correct token usage
- Remove duplicate request logging from ExtensionContributedChatEndpoint
- Simplify requestLogTree.ts export logic (remove lastChatRequestItem fallback)
Copilot AI review requested due to automatic review settings February 5, 2026 08:17
@vs-code-engineering vs-code-engineering bot added this to the February 2026 milestone Feb 5, 2026

Copilot AI left a comment

Pull request overview

This PR fixes BYOK request logging by implementing a correlation ID mechanism to preserve AsyncLocalStorage context across IPC boundaries, eliminating duplicate log entries and ensuring requests appear correctly as children of their prompt items.

Changes:

  • Added correlation ID mechanism to preserve CapturingToken context across VS Code IPC boundaries
  • Removed duplicate logging from ExtensionContributedChatEndpoint (now handled by BYOK providers)
  • Updated BYOK providers to restore context using correlation IDs
  • Simplified request log export logic by removing workarounds for orphaned requests

Reviewed changes

Copilot reviewed 6 out of 6 changed files in this pull request and generated 1 comment.

Summary per file:

  • src/platform/requestLogger/node/requestLogger.ts: Added correlation map and helper functions to store/retrieve CapturingToken across IPC
  • src/platform/endpoint/vscode-node/extChatEndpoint.ts: Stores correlation ID before the IPC call; removed duplicate logging and unused dependencies
  • src/extension/byok/vscode-node/anthropicProvider.ts: Retrieves correlation ID and wraps request execution in the restored context
  • src/extension/byok/vscode-node/geminiNativeProvider.ts: Retrieves correlation ID and wraps request execution in the restored context
  • src/extension/conversation/vscode-node/languageModelAccess.ts: CopilotLanguageModelWrapper retrieves the correlation ID for OpenAI-compatible providers
  • src/extension/log/vscode-node/requestLogTree.ts: Removed workaround logic for re-associating orphaned requests

// BYOK providers (Anthropic, Gemini, CopilotLanguageModelWrapper) handle their own
// logging with correct token usage. Logging here would create duplicates with
// incorrect (0) token counts since we don't have access to actual usage stats.
storeCapturingTokenForCorrelation(ourRequestId);

Copilot AI Feb 5, 2026

Potential memory leak: If the request fails before reaching a BYOK provider (e.g., languageModel.sendRequest throws immediately, or if the model is not a BYOK provider), the correlation ID stored here will never be retrieved, leaving entries in capturingTokenCorrelationMap that are never cleaned up.

The correlation ID should be retrieved in a finally block or there should be a timeout-based cleanup mechanism for the correlation map. Alternatively, always attempt to retrieve (and discard) the correlation ID in the catch/finally block of this method to ensure cleanup even when errors occur.
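
A minimal sketch of the finally-based cleanup the comment suggests, assuming a Map-backed correlation store; the names and the synchronous shape are illustrative only:

```typescript
const correlationMap = new Map<string, object>();

// Store the token, then guarantee the entry is removed whether the request
// succeeds, throws before reaching a BYOK provider, or targets a non-BYOK
// model that never retrieves it.
function sendWithCleanup<T>(requestId: string, send: () => T): T {
    correlationMap.set(requestId, { /* CapturingToken */ });
    try {
        return send();
    } finally {
        // No-op if a BYOK provider already consumed the entry; otherwise
        // prevents the map from leaking on early failures.
        correlationMap.delete(requestId);
    }
}
```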

zhichli (Member, Author) replied:

@copilot open a new pull request to apply changes based on this feedback

@alexdima alexdima requested a review from lramos15 February 5, 2026 08:23

Copilot AI commented Feb 5, 2026

@zhichli I've opened a new pull request, #3475, to work on those changes. Once the pull request is ready, I'll request review from you.
