Add baseten integration #389
base: main
Conversation
Hey @philipkiely-baseten, That said, we absolutely want to support your integration! We'd recommend one of these approaches:
Either way, we'd be happy to feature you on our documentation page as a supported model provider, giving you visibility to our community.
from typing_extensions import Unpack, override

from ..types.content import Messages
from ..types.models import OpenAIModel
This import path is out of date.
        return cast(BasetenModel.BasetenConfig, self.config)

    @override
    def stream(self, request: dict[str, Any]) -> Iterable[dict[str, Any]]:
Also here.
        elif "base_url" in self.config:
            client_args["base_url"] = self.config["base_url"]

        self.client = openai.OpenAI(**client_args)
We've migrated to AsyncOpenAI in our implementation. Please verify this change is properly reflected throughout the codebase in your PR. Also, ensure you've pulled the most recent code before proceeding with your review.
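A minimal sketch of what the reviewer is asking for, assuming the `client_args` construction from the diff above (the `build_client_args` helper name is hypothetical, introduced here only to make the snippet runnable):

```python
from typing import Any


def build_client_args(config: dict[str, Any]) -> dict[str, Any]:
    """Collect client keyword arguments from the model config.

    Mirrors the elif-chain in the diff above; only the matching
    branch contributes to the resulting kwargs.
    """
    client_args: dict[str, Any] = {}
    if "api_key" in config:
        client_args["api_key"] = config["api_key"]
    elif "base_url" in config:
        client_args["base_url"] = config["base_url"]
    return client_args
    # With the migration, the client would then be constructed as:
    #   self.client = openai.AsyncOpenAI(**client_args)
    # instead of openai.OpenAI(**client_args).


print(build_client_args({"base_url": "https://model.api.baseten.co/v1"}))
```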
        Returns:
            An iterable of response events from the Baseten model.
        """
        response = self.client.chat.completions.create(**request)
The `create` call also needs to be async here ^^
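A self-contained sketch of the async streaming pattern being requested. The `fake_create` stub below stands in for `openai.AsyncOpenAI`'s `chat.completions.create(..., stream=True)` (a coroutine that resolves to an async iterator); the chunk dicts are illustrative, not the real SDK objects:

```python
import asyncio
from typing import Any, AsyncIterator


async def fake_create(**request: Any) -> AsyncIterator[dict[str, Any]]:
    """Stub: awaitable that resolves to an async iterator of chunks."""

    async def events() -> AsyncIterator[dict[str, Any]]:
        yield {"chunk_type": "content", "data": "Hello"}
        yield {"chunk_type": "metadata", "data": {"completion_tokens": 1}}

    return events()


async def stream(request: dict[str, Any]) -> AsyncIterator[dict[str, Any]]:
    # With AsyncOpenAI, create() must be awaited and the streamed
    # response consumed with `async for` rather than a plain for-loop.
    response = await fake_create(**request)
    async for event in response:
        yield event


async def collect() -> list[dict[str, Any]]:
    return [event async for event in stream({"model": "demo", "stream": True})]


chunks = asyncio.run(collect())
print(chunks)
```

The same `await` + `async for` shape applies wherever the sync client was previously called, which is why the reviewer flags several sites in this diff.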
            yield {"chunk_type": "metadata", "data": event.usage}

    @override
    def structured_output(
You might want to update this to async as well.
Description
Adds Baseten as a model provider
Related Issues
Documentation PR
strands-agents/docs#124
Type of Change
New feature
Testing
How have you tested the change? Verify that the changes do not break functionality or introduce warnings in consuming repositories: agents-docs, agents-tools, agents-cli
hatch run prepare
Checklist
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.