feat: Add AI SDK v6 support #307
Conversation
- Update package dependencies to AI SDK v6 beta versions
- Migrate ProviderV2 → ProviderV3
- Migrate LanguageModelV2 → LanguageModelV3
- Migrate EmbeddingModelV2 → EmbeddingModelV3
- Update all type imports to V3 equivalents
- Add comprehensive implementation documentation

Known issues:
- Usage structures need migration to V3 nested format
- Test files need V3 type updates
- Build currently fails due to usage type mismatches

See AI-SDK-V6-IMPLEMENTATION.md for full details
- Migrate from ProviderV2 to ProviderV3
- Migrate from LanguageModelV2 to LanguageModelV3
- Migrate from EmbeddingModelV2 to EmbeddingModelV3
- Update specificationVersion to 'v3'
- Update usage structure to nested v6 format:
- inputTokens: { total, noCache, cacheRead, cacheWrite }
- outputTokens: { total, text, reasoning }
- Update all type imports (SharedV2* -> SharedV3*, etc.)
- Update dependencies:
- @ai-sdk/provider: 2.0.0 -> 3.0.0-beta.27
- @ai-sdk/provider-utils: 3.0.1 -> 4.0.0-beta.53
- ai peer dep: ^6.0.0
Tested with free models on OpenRouter:
- Basic text generation ✅
- Streaming ✅
- Tool calling ✅
- Reasoning tokens ✅
- Multi-turn conversations ✅
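The nested v6 usage shape from the bullets above can be sketched as follows. This is a hedged illustration: the field names (`total`, `noCache`, `cacheRead`, `cacheWrite`, `text`, `reasoning`) come from this PR description, but the exact optionality in the real `LanguageModelV3Usage` type may differ, and `toNestedUsage` is a hypothetical helper, not part of the provider's API.

```typescript
// Sketch of the nested v6 usage structure described in the PR.
type UsageV3 = {
  inputTokens: { total: number; noCache: number; cacheRead: number; cacheWrite: number };
  outputTokens: { total: number; text: number; reasoning: number };
};

// Hypothetical converter from flat v5-style token counts to the nested shape.
// Assumes no cached tokens; a real provider would fill cacheRead/cacheWrite
// from the upstream API response.
function toNestedUsage(
  promptTokens: number,
  completionTokens: number,
  reasoningTokens = 0,
): UsageV3 {
  return {
    inputTokens: {
      total: promptTokens,
      noCache: promptTokens,
      cacheRead: 0,
      cacheWrite: 0,
    },
    outputTokens: {
      total: completionTokens,
      // Text tokens are whatever is left after reasoning tokens.
      text: completionTokens - reasoningTokens,
      reasoning: reasoningTokens,
    },
  };
}

console.log(toNestedUsage(100, 50, 20).outputTokens.text); // 30
```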
|
- Remove documentation files from PR
- Fix all V2→V3 type migrations in source files
- Add MSW-based test server utility to replace removed createTestServer
- Migrate all test files to V3 types
- Fix JSONSchema7 type issues in tests
- Fix requestBody unknown type issues
- Typecheck now passes with 0 errors
- Build succeeds
- Live tests pass (basic, streaming, tool calling, reasoning, multi-turn)
|
final v6 is released. would be great if this got updated, reviewed, merged, and released. |
|
would this constitute a breaking change, or are the types backwards compatible? |
|
https://ai-sdk.dev/docs/migration-guides/migration-guide-6-0 check here, I think they are compatible. |
|
would be great to see this PR move forward so we can start using AI SDK v6 |
|
Any update on this? Relying totally on OpenRouter right now, and using v6 at the same time. |
|
Just migrated our app to AI SDK v6 using this PR👍 |
- Update @ai-sdk/provider from 3.0.0-beta.27 to 3.0.0
- Update @ai-sdk/provider-utils from 4.0.0-beta.53 to 4.0.1
- Update ai from 6.0.0-beta.159 to 6.0.3
- Update tests for v6 usage format (nested inputTokens/outputTokens)
- Update tests for v3 embedding specification version
- Fix header tests for extended user-agent format
Updated to stable AI SDK v6 releases

I've updated the dependencies to use stable versions since v6 is now released.
All tests pass with the stable versions. |
The-Best-Codes left a comment
Really hoping this can get finished up & merged soon! Would love to use OpenRouter with AI SDK in my applications! 🙌
|
Yeah this is what's blocking our AI SDK v6 migration. |
|
+1 |
|
There is |
|
Any update? 🙏 |
|
No updates? As a user on the enterprise plan, I really need v6 for agents, and here we are. I know the team is putting in a lot of effort, but two weeks with no single commit or update here is crazy. |
|
Tested locally with AI SDK v6. Found a critical issue: a LanguageModelV3FinishReason type mismatch with AI SDK v6. Once this is fixed we can merge and work on a release. Thanks for this! |
|
Thank you @mattapperson |
In AI SDK v6, the LanguageModelV3FinishReason type changed from a string
to an object with `unified` and `raw` properties. This change ensures
the provider returns the correct format:
- `unified`: The normalized finish reason (stop, length, tool-calls, etc.)
- `raw`: The original finish reason string from the API
Changes:
- Update mapOpenRouterFinishReason to return { unified, raw } object
- Add createFinishReason helper for creating finish reasons
- Update chat and completion streaming to use new format
- Update all tests to expect object format
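The change in the commit above can be sketched as follows. This is an illustrative mapper, not the provider's actual `mapOpenRouterFinishReason`: the `unified` values are taken from this thread, while the raw OpenRouter strings in the switch (`content_filter`, `tool_calls`, etc.) are assumed to follow OpenAI-style naming.

```typescript
// Sketch of mapping a raw API finish reason to the v6 { unified, raw } object.
type FinishReasonV3 = {
  unified: 'stop' | 'length' | 'content-filter' | 'tool-calls' | 'error' | 'other';
  raw: string | undefined; // original finish reason from the API
};

function mapFinishReason(raw: string | null | undefined): FinishReasonV3 {
  switch (raw) {
    case 'stop':
      return { unified: 'stop', raw };
    case 'length':
      return { unified: 'length', raw };
    case 'content_filter':
      return { unified: 'content-filter', raw };
    case 'tool_calls':
      return { unified: 'tool-calls', raw };
    case 'error':
      return { unified: 'error', raw };
    default:
      // Unknown or missing reasons fall back to 'other', preserving the raw value.
      return { unified: 'other', raw: raw ?? undefined };
  }
}

console.log(mapFinishReason('tool_calls').unified); // tool-calls
```

Preserving `raw` alongside `unified` lets callers branch on the normalized value while still logging the provider-specific string.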
Fixed: LanguageModelV3FinishReason type

Thanks @mattapperson for catching this issue! The finish reason is now returned as:

```
{
  unified: 'stop' | 'length' | 'content-filter' | 'tool-calls' | 'error' | 'other',
  raw: string | undefined // Original finish reason from OpenRouter
}
```

All tests pass with the stable AI SDK v6 releases. |
|
Thanks @pablof7z |
|
Any update? 🙏 |
|
@nvti Maintainers likely are on vacation mode until tomorrow, so I wouldn't expect to see anything until at least then. |
|
@subtleGradient Thank you so much |
|
Thank youuuu!! |
|
Any update? |
|
great minds think alike but I have a gift for you: https://github.com/OpenRouterTeam/ai-sdk-provider/releases/tag/v6.0.0-alpha.1 |
|
@subtleGradient LOVE YOUU LOL |
|
@subtleGradient Thank you so much! Happy 2026❤️ |
Thanks, but having trouble enabling reasoning options. |
|
@subtleGradient Thank you so much!!! |
- Bump version to 2.0.0
- Fix zod peer dependency (^3.25.0 || ^4.0.0)
- Make file content part fields optional and add file_id
- Handle LanguageModelV3ToolApprovalResponsePart in tool responses
- Apply biome formatting fixes
|
Thank you @pablof7z and everyone else for your help on this! |
* feat: Initial AI SDK v6 support implementation
- Update package dependencies to AI SDK v6 beta versions
- Migrate ProviderV2 → ProviderV3
- Migrate LanguageModelV2 → LanguageModelV3
- Migrate EmbeddingModelV2 → EmbeddingModelV3
- Update all type imports to V3 equivalents
- Add comprehensive implementation documentation
Known issues:
- Usage structures need migration to V3 nested format
- Test files need V3 type updates
- Build currently fails due to usage type mismatches
See AI-SDK-V6-IMPLEMENTATION.md for full details
* feat: Complete AI SDK v6 support
- Migrate from ProviderV2 to ProviderV3
- Migrate from LanguageModelV2 to LanguageModelV3
- Migrate from EmbeddingModelV2 to EmbeddingModelV3
- Update specificationVersion to 'v3'
- Update usage structure to nested v6 format:
- inputTokens: { total, noCache, cacheRead, cacheWrite }
- outputTokens: { total, text, reasoning }
- Update all type imports (SharedV2* -> SharedV3*, etc.)
- Update dependencies:
- @ai-sdk/provider: 2.0.0 -> 3.0.0-beta.27
- @ai-sdk/provider-utils: 3.0.1 -> 4.0.0-beta.53
- ai peer dep: ^6.0.0
Tested with free models on OpenRouter:
- Basic text generation ✅
- Streaming ✅
- Tool calling ✅
- Reasoning tokens ✅
- Multi-turn conversations ✅
* fix: Complete AI SDK v6 type migration
- Remove documentation files from PR
- Fix all V2→V3 type migrations in source files
- Add MSW-based test server utility to replace removed createTestServer
- Migrate all test files to V3 types
- Fix JSONSchema7 type issues in tests
- Fix requestBody unknown type issues
- Typecheck now passes with 0 errors
- Build succeeds
- Live tests pass (basic, streaming, tool calling, reasoning, multi-turn)
* chore: update to stable AI SDK v6 releases
- Update @ai-sdk/provider from 3.0.0-beta.27 to 3.0.0
- Update @ai-sdk/provider-utils from 4.0.0-beta.53 to 4.0.1
- Update ai from 6.0.0-beta.159 to 6.0.3
- Update tests for v6 usage format (nested inputTokens/outputTokens)
- Update tests for v3 embedding specification version
- Fix header tests for extended user-agent format
* fix: Return LanguageModelV3FinishReason as object { unified, raw }
In AI SDK v6, the LanguageModelV3FinishReason type changed from a string
to an object with `unified` and `raw` properties. This change ensures
the provider returns the correct format:
- `unified`: The normalized finish reason (stop, length, tool-calls, etc.)
- `raw`: The original finish reason string from the API
Changes:
- Update mapOpenRouterFinishReason to return { unified, raw } object
- Add createFinishReason helper for creating finish reasons
- Update chat and completion streaming to use new format
- Update all tests to expect object format
* fix: address review feedback and merge main branch fixes
- Bump version to 2.0.0
- Fix zod peer dependency (^3.25.0 || ^4.0.0)
- Make file content part fields optional and add file_id
- Handle LanguageModelV3ToolApprovalResponsePart in tool responses
- Apply biome formatting fixes
* chore: pin devDependency versions
* chore: pin ai peer dep
* docs: update README for AI SDK v6 release
---------
Co-authored-by: Robert Yeakel <212159665+robert-j-y@users.noreply.github.com>
This commit addresses issue #186, where `bun install` shows a "warn: incorrect peer dependency ai@6.0.86" warning.

Root cause: @openrouter/ai-sdk-provider@1.5.4 requires ai@^5.0.0 as a peer dependency, but @link-assistant/agent uses ai@^6.0.1.

Solution:
- Update @openrouter/ai-sdk-provider from ^1.5.4 to ^2.2.3 (version 2.0.0+ supports AI SDK v6)
- Update @opentui/core from ^0.1.46 to ^0.1.79
- Update @opentui/solid from ^0.1.46 to ^0.1.79

Note: The solid-js peer dependency warning remains due to an upstream issue in @opentui/solid, which uses exact version pinning (1.9.9). Reported at: anomalyco/opentui#689

Added case study documentation in docs/case-studies/issue-186/

References:
- OpenRouterTeam/ai-sdk-provider#307
- https://github.com/OpenRouterTeam/ai-sdk-provider/releases/tag/2.0.0

Fixes #186

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
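The peer-dependency warning above comes down to caret-range semantics: `^5.0.0` only accepts versions with the same major, so `ai@6.0.86` cannot satisfy it. A minimal sketch of that check (`satisfiesCaret` is a simplified illustration; real semver carets have extra rules for 0.x versions, which package managers like bun handle in full):

```typescript
// Simplified caret-range check: a ^X.Y.Z range accepts only major version X.
// (Real semver additionally treats ^0.y.z ranges specially; omitted here.)
function satisfiesCaret(version: string, range: string): boolean {
  const major = (v: string) => Number(v.replace(/^\^/, '').split('.')[0]);
  return major(version) === major(range);
}

console.log(satisfiesCaret('6.0.86', '^5.0.0')); // false -> bun warns
console.log(satisfiesCaret('6.0.3', '^6.0.0')); // true  -> fixed by the 2.x provider
```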

Summary
This PR adds support for AI SDK v6 (currently in beta).
Changes
- `ProviderV2` to `ProviderV3`
- `LanguageModelV2` to `LanguageModelV3`
- `EmbeddingModelV2` to `EmbeddingModelV3`
- `specificationVersion` to `'v3'`
- Type imports updated (`SharedV2*` -> `SharedV3*`, etc.)
- `@ai-sdk/provider`: `2.0.0` -> `3.0.0-beta.27`
- `@ai-sdk/provider-utils`: `3.0.1` -> `4.0.0-beta.53`
- `ai` peer dep: `^6.0.0`

Testing
Tested with free models on OpenRouter:
- Basic text generation ✅
- Streaming ✅
- Tool calling ✅
- Reasoning tokens ✅
- Multi-turn conversations ✅
Notes
AI SDK v6 is currently in beta. This PR tracks the beta versions. Once v6 is stable, dependency versions can be updated to stable releases.