feat: Adding portkey.ai gateway as a custom model #197
Conversation
Using Portkey with OpenAI GPT-3.5 Turbo (screen recording: Screen.Recording.2025-06-10.at.2.12.32.AM.mov)
Docs: strands-agents/docs#87
cc: @awsarron - would appreciate your review, as many companies use this technique to connect to GenAI models.
Hi @fede-dash, thank you for your contribution! We're working through Strands' built-in model provider strategy at the moment. To help us with this discussion, could you please share details about what builders get with Portkey over LiteLLM, which already supports hundreds of model providers and has a built-in model provider in Strands?
Thanks again for your support. Just to clarify: our organization has already standardized around Portkey as the entry point for GenAI. It handles routing, observability, caching, and tooling across providers, including full support for AWS Bedrock. We're not alone; many other companies, from startups to Fortune 500s, are using Portkey for the same reasons.

LiteLLM isn't a feasible alternative for us simply because we've already built on top of Portkey, and maintaining separate forks or custom integrations per org doesn't scale.

Adding native Portkey support in Strands would make adoption easier not just for us, but for many organizations heading in the same direction. We really appreciate you considering this.

Portkey also notes it is trusted by 650+ global organizations, processing trillions of tokens.
Best,
Fede-dash
🚀 Portkey Integration for Strands SDK
This PR adds first-class support for Portkey, a gateway platform for LLM providers, to the Strands SDK. It introduces a unified, configurable, and extensible way to interact with multiple AI models (OpenAI, Anthropic via Bedrock, Gemini, and more) through a single abstraction.
The integration emphasizes tool use and streaming completions, and lays the foundation for upcoming features such as multi-modal interactions (images and audio, coming in a future PR).
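Because the integration emphasizes tool use across providers, a rough sketch of the kind of mapping involved may help. The function below converts a Strands-style tool specification into the OpenAI function-calling format that most gateway-fronted providers accept. The field names (`name`, `description`, `inputSchema`) are illustrative assumptions, not the PR's actual code:

```python
# Hedged sketch: mapping a Strands-style tool spec to the OpenAI
# "tools" format. Field names are assumptions for illustration only.

def tool_spec_to_openai(tool_spec: dict) -> dict:
    """Convert a tool spec ({"name", "description", "inputSchema"})
    into an OpenAI-style function-calling tool definition."""
    return {
        "type": "function",
        "function": {
            "name": tool_spec["name"],
            "description": tool_spec.get("description", ""),
            # inputSchema is assumed to hold a plain JSON Schema object.
            "parameters": tool_spec.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }

weather_tool = {
    "name": "get_weather",
    "description": "Look up current weather for a city.",
    "inputSchema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

openai_tool = tool_spec_to_openai(weather_tool)
```

The reverse direction (normalizing provider `tool_calls` back into a common shape) follows the same pattern in the other direction.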
🔌 What is Portkey?
Portkey is a powerful LLM gateway that offers routing, observability, caching, and tooling across providers.
Portkey is trusted by industry leaders like Notion, Descript, and Ramp as the go-to abstraction for managing complex LLM infrastructure at scale.
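As a hedged sketch of how a gateway like Portkey is typically addressed: clients send OpenAI-compatible requests to a single endpoint and select the upstream provider via routing headers. The base URL and header names below are assumptions for illustration and may differ from Portkey's current API:

```python
# Hedged sketch: building an OpenAI-compatible request routed through a
# gateway. The base URL and header names are illustrative assumptions.

PORTKEY_BASE_URL = "https://api.portkey.ai/v1"  # assumed endpoint

def build_gateway_request(prompt: str, provider: str, model: str,
                          portkey_api_key: str) -> tuple[dict, dict]:
    """Return (headers, payload) for a chat completion routed via the gateway."""
    headers = {
        "Content-Type": "application/json",
        # Routing metadata; exact header names vary by gateway version.
        "x-portkey-api-key": portkey_api_key,
        "x-portkey-provider": provider,
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,  # the integration emphasizes streaming completions
    }
    return headers, payload

headers, payload = build_gateway_request(
    "Hello!", provider="openai", model="gpt-3.5-turbo",
    portkey_api_key="pk-...")
```

The point of the single-endpoint design is that swapping `provider` and `model` is a header and payload change, not a new client library.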
✅ Supported Providers
This integration supports, and has been tested with, the following providers:
🛠️ Tool Use Support
🌟 Benefits of Portkey Integration
- `tool_calls` across providers

🌍 Real-World Usage
Without Portkey support, many real-world systems cannot adopt the SDK due to centralized governance and observability requirements.
Organizations already standardizing on Portkey include:
📎 Reference: Portkey Customers
🧱 Technical Implementation
This PR extends the `Model` interface within the Strands SDK, Amazon's agent platform for intelligent task automation.

Features implemented:
- Tool calling (`tool_calls` / `tool_use`)

🆕 New Class: `PortkeyModel`

Key responsibilities:
- Implements the Strands `Model` interface
- `ToolSpec` mapping and schema encoding

🧪 Validation
Tested and validated for:
📋 Next Steps
🔗 References
📝 PR Metadata
Description
This PR adds comprehensive documentation for the new Portkey integration in the Strands SDK. It includes installation instructions, usage examples, configuration options, and troubleshooting guidance tailored to help developers quickly adopt and understand Portkey as a provider interface for language models like OpenAI, Anthropic (via Bedrock), and others.
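As an illustration of the kind of configuration surface the docs describe (`model_id`, `params`, `provider`), here is a hedged sketch of a small config validator. The field names come from this PR's description; the defaults and checks are illustrative assumptions, not the documented API:

```python
# Hedged sketch: validating a Portkey-style model configuration.
# Field names come from this PR's description; defaults and checks are
# illustrative assumptions, not the documented API.

REQUIRED_FIELDS = ("model_id", "provider")

def validate_portkey_config(config: dict) -> dict:
    """Fill defaults and reject configs missing required fields."""
    missing = [f for f in REQUIRED_FIELDS if f not in config]
    if missing:
        raise ValueError(f"missing required config fields: {missing}")
    validated = dict(config)
    validated.setdefault("params", {})  # provider-specific kwargs
    validated["params"].setdefault("temperature", 0.7)  # assumed default
    return validated

config = validate_portkey_config({
    "model_id": "gpt-3.5-turbo",
    "provider": "openai",
})
```

Failing fast on missing fields keeps misconfiguration errors at construction time rather than at the first model call.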
Related Issues
Documentation PR
🔀 Type of Change
🧪 Testing
- `hatch fmt --linter`
- `hatch fmt --formatter`
- `hatch test --all`
- `agent-docs`
- `agent-tools`
- `agent-cli`
✅ Checklist
- `mkdocs serve`
Motivation and Context
Portkey support was recently added to the Strands SDK to provide a unified gateway for multiple LLM providers. This documentation update ensures users can:
It also serves as the foundational guide for future features such as multi-modal support and advanced routing.
Areas Affected
- `docs/portkey.md`: new section added for the Portkey integration, covering configuration options (`model_id`, `params`, `provider`, etc.)

Screenshots
📸 Not applicable; this is a text-only documentation change. Code block samples are included in the file.
Additional Notes
This documentation aligns with the broader goals of making the SDK pluggable and provider-agnostic, catering to real-world enterprise environments that already leverage Portkey for observability, routing, and API abstraction.
For broader context and the technical motivations behind this integration, please refer to the parent PR in the SDK:
🔗 Portkey Integration for Strands SDK
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.