
Conversation

Contributor

Copilot AI commented Jan 6, 2026

  • Create sample directory structure at samples/llm-usage-csv-export
  • Create Dev Proxy configuration file (devproxyrc.json) using OpenAIUsageDebuggingPlugin
  • Create sample README.md with documentation and usage instructions
  • Create assets/sample.json metadata file
  • Create placeholder screenshot asset
  • Address code review feedback (added API key warning)
  • Run CodeQL security check (no issues detected)
  • Update author and metadata to comply with copilot-instructions.md
  • Update sample dates (creationDateTime and updateDateTime) to 2026-01-10
  • Update screenshot to reflect current sample output
  • Move config files to .devproxy folder
  • Remove properties with default values (logLevel)
  • Fix validation issues: correct README date, change PRESET to "No"
Original prompt

This section details the original issue you should resolve

<issue_title>Sample: llm-usage-csv-export - Export LLM usage metrics to CSV</issue_title>
<issue_description>## Summary
Create a sample that exports LLM usage metrics to CSV for detailed analysis.

Plugins to use

  • OpenAIUsageDebuggingPlugin - logs usage to CSV file

What to build

  • Configure CSV export for LLM requests
  • Sample app making various LLM calls
  • Show the generated CSV with columns:
    • time, status, retry-after, policy
    • prompt tokens, completion tokens, cached tokens, total tokens
    • remaining tokens, remaining requests
  • Excel/spreadsheet analysis examples
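
A minimal .devproxy/devproxyrc.json enabling the plugin might look like the following sketch. This is an illustration, not the merged sample's config: the schema URL, pluginPath, and urlsToWatch values are assumptions and would need to match your Dev Proxy installation and target API.

```json
{
  "$schema": "https://raw.githubusercontent.com/dotnet/dev-proxy/main/schemas/v1.0.0/rc.schema.json",
  "plugins": [
    {
      "name": "OpenAIUsageDebuggingPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/DevProxy.Plugins.dll"
    }
  ],
  "urlsToWatch": [
    "https://api.openai.com/*"
  ]
}
```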

Output example

time,status,retry-after,policy,prompt tokens,completion tokens,cached tokens,total tokens,remaining tokens,remaining requests
2025-10-28T10:15:30.123Z,200,,,150,75,,225,9850,49

Why this is compelling

Detailed debugging without a complex telemetry setup; analyze the CSV in a spreadsheet or import it into BI tools.
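
As a sketch of the offline-analysis idea, assuming the CSV header shown in the output example above, the exported file can be aggregated with just the Python standard library (no spreadsheet needed):

```python
import csv
import io

# Sample data in the CSV format shown in the output example above;
# in practice you would open the file the plugin writes instead.
data = """time,status,retry-after,policy,prompt tokens,completion tokens,cached tokens,total tokens,remaining tokens,remaining requests
2025-10-28T10:15:30.123Z,200,,,150,75,,225,9850,49
"""

rows = list(csv.DictReader(io.StringIO(data)))

# Aggregate token usage across all logged requests;
# empty cells (e.g. unset "cached tokens") count as 0.
prompt_tokens = sum(int(r["prompt tokens"] or 0) for r in rows)
total_tokens = sum(int(r["total tokens"] or 0) for r in rows)

print(f"requests: {len(rows)}, prompt: {prompt_tokens}, total: {total_tokens}")
```

The same per-column aggregation maps directly onto a spreadsheet SUM over the corresponding column, or a measure in a BI tool after import.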

Comparison with OpenAITelemetryPlugin

  • OpenAITelemetryPlugin → real-time dashboards
  • OpenAIUsageDebuggingPlugin → CSV for offline analysis

Both can be enabled simultaneously.

Reference docs

  • OpenAIUsageDebuggingPlugin technical reference
  • Understand language model usage how-to guide
    </issue_description>

Comments on the Issue (you are @copilot in this section)



Copilot AI changed the title [WIP] Add CSV export for LLM usage metrics Sample: llm-usage-csv-export - Export LLM usage metrics to CSV Jan 6, 2026
Copilot AI requested a review from waldekmastykarz January 6, 2026 11:54
@waldekmastykarz
Member

@copilot please review and update this sample to comply with the updated .github/copilot-instructions.md

Contributor Author

Copilot AI commented Jan 10, 2026

@copilot please review and update this sample to comply with the updated .github/copilot-instructions.md

Updated the sample to comply with the copilot-instructions.md in commit 6968043:

  • Changed author from "Copilot" to "Waldek Mastykarz" in both README.md and sample.json
  • Updated updateDateTime to current date (2026-01-10)

@waldekmastykarz
Member

@copilot please also:

  • Change the author in assets/sample.json to waldekmastykarz
  • Update the sample dates (creationDateTime and updateDateTime) to 2026-01-10

Contributor Author

Copilot AI commented Jan 10, 2026

@copilot please also:

  • Change the author in assets/sample.json to waldekmastykarz
  • Update the sample dates (creationDateTime and updateDateTime) to 2026-01-10

Done in commit 0e6bbf1. The author was already set to waldekmastykarz and updateDateTime was already "2026-01-10". Updated creationDateTime from "2026-01-06" to "2026-01-10".

@waldekmastykarz
Member

@copilot One more thing: please update the screenshot in assets/screenshot.png to reflect the current sample output.

Contributor Author

Copilot AI commented Jan 10, 2026

@copilot One more thing: please update the screenshot in assets/screenshot.png to reflect the current sample output.

Updated the screenshot in commit 9afc117. The new screenshot shows Dev Proxy running with the OpenAIUsageDebuggingPlugin, including token usage logging, rate limit handling, and the generated CSV output.

Updated screenshot

@waldekmastykarz
Member

@copilot Please make the following changes:

  1. Move all Dev Proxy config files to the .devproxy folder - All configuration files (like devproxyrc.json, mock files, error files, etc.) should be placed in a .devproxy subfolder within the sample directory. This is the recommended location for Dev Proxy configs.

  2. Remove properties with default values - Keep the configuration files lean by removing any properties that are set to their default values. For example:

    • Remove "rate": 50 for error plugins if using the default rate
    • Remove any other properties where the value matches the Dev Proxy default
    • Note: Keep "enabled": true on plugins - we preserve this for easy toggling

This keeps the sample configs minimal and easier to understand.

Contributor Author

Copilot AI commented Jan 10, 2026

@copilot Please make the following changes:

  1. Move all Dev Proxy config files to the .devproxy folder - All configuration files (like devproxyrc.json, mock files, error files, etc.) should b...

Done in commit 580c65e:

  1. Moved devproxyrc.json to .devproxy/devproxyrc.json
  2. Removed logLevel: "information" property (default value)
  3. Updated screenshot to show config loading from .devproxy/devproxyrc.json

Updated screenshot

@waldekmastykarz waldekmastykarz marked this pull request as ready for review January 17, 2026 11:14
Copilot AI and others added 6 commits January 18, 2026 15:00
Co-authored-by: waldekmastykarz <11164679+waldekmastykarz@users.noreply.github.com>
@waldekmastykarz
Member

@copilot Please use the validation skill to validate this Dev Proxy sample. Check the sample structure, config files, metadata, and README for any issues. If you find problems, please fix them.

Contributor Author

Copilot AI commented Jan 18, 2026

@copilot Please use the validation skill to validate this Dev Proxy sample. Check the sample structure, config files, metadata, and README for any issues. If you find problems, please fix them.

Ran the validation skill and found two issues, fixed in commit 6ab8fb0:

  1. Date inconsistency: README version history showed "January 6, 2026" but sample.json had "2026-01-10". Updated README to "January 10, 2026".

  2. PRESET value: Changed from "Yes" to "No" since this sample is specific to OpenAI-compatible APIs, not a truly reusable generic config.

All validation checks now pass:

  • ✓ Directory structure correct (.devproxy/, README.md, assets/sample.json)
  • ✓ Schema validation passed
  • ✓ Metadata name matches folder
  • ✓ URL format correct
  • ✓ Badge version matches PROXY VERSION (v2.0.0)
  • ✓ Dates consistent between files
  • ✓ Descriptions match (short = long[0])
  • ✓ Startup command correct (just devproxy)
  • ✓ curl command includes proxy flag
  • ✓ Tracking pixel present



Development

Successfully merging this pull request may close these issues.

Sample: llm-usage-csv-export - Export LLM usage metrics to CSV