user guide - models - openai #45


Merged: 1 commit, May 23, 2025
5 changes: 4 additions & 1 deletion docs/api-reference/models.md
@@ -11,9 +11,12 @@
 ::: strands.models.litellm
     options:
       heading_level: 2
+::: strands.models.llamaapi
+    options:
+      heading_level: 2
 ::: strands.models.ollama
     options:
       heading_level: 2
-::: strands.models.llamaapi
+::: strands.models.openai
     options:
       heading_level: 2
75 changes: 75 additions & 0 deletions docs/user-guide/concepts/model-providers/openai.md
@@ -0,0 +1,75 @@
# OpenAI

[OpenAI](https://platform.openai.com/docs/overview) is an AI research and deployment company that provides a suite of powerful language models. The Strands Agents SDK implements an OpenAI provider, allowing you to run agents against any OpenAI or OpenAI-compatible model.

## Installation

OpenAI is configured as an optional dependency in Strands Agents. To install, run:

```bash
pip install 'strands-agents[openai]'
```

## Usage

After installing `openai`, you can import and initialize the Strands Agents OpenAI provider as follows:

```python
from strands import Agent
from strands.models.openai import OpenAIModel
from strands_tools import calculator

model = OpenAIModel(
client_args={
"api_key": "<KEY>",
},
# **model_config
model_id="gpt-4o",
params={
"max_tokens": 1000,
"temperature": 0.7,
}
)

agent = Agent(model=model, tools=[calculator])
response = agent("What is 2+2?")
print(response)
```

To connect to a custom OpenAI-compatible server, pass its `base_url` in the `client_args`:

```python
model = OpenAIModel(
client_args={
"api_key": "<KEY>",
"base_url": "<URL>",
},
...
)
```
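As a concrete sketch, many local OpenAI-compatible servers (Ollama, for example) expose an OpenAI-style API under a `/v1` path; the URL and key below are assumptions to adjust for your own server:

```python
# Hypothetical client_args for a local OpenAI-compatible server.
# Ollama, for instance, serves an OpenAI-style endpoint at /v1 by default;
# adjust the URL and key for your deployment.
local_client_args = {
    "api_key": "unused",  # many local servers accept any non-empty key
    "base_url": "http://localhost:11434/v1",
}
```

These arguments would replace the `client_args` in the snippet above; the rest of the `OpenAIModel` configuration is unchanged.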

## Configuration

### Client Configuration

The `client_args` configure the underlying OpenAI client. For a complete list of available arguments, please refer to the OpenAI [source](https://github.com/openai/openai-python).
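Beyond `api_key` and `base_url`, the OpenAI Python client also accepts connection-level settings such as `timeout` and `max_retries`. Assuming `client_args` are forwarded verbatim to the `openai.OpenAI` constructor, a sketch might look like:

```python
# Connection-level settings forwarded to the underlying OpenAI client.
client_args = {
    "api_key": "<KEY>",
    "timeout": 30.0,   # per-request timeout in seconds
    "max_retries": 2,  # automatic retries for transient failures
}
```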

### Model Configuration

The `model_config` configures the underlying model selected for inference. The supported configurations are:

| Parameter | Description | Example | Options |
|------------|-------------|---------|---------|
| `model_id` | ID of a model to use | `gpt-4o` | [reference](https://platform.openai.com/docs/models) |
| `params` | Model-specific parameters | `{"max_tokens": 1000, "temperature": 0.7}` | [reference](https://platform.openai.com/docs/api-reference/chat/create) |
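Putting the table together: the `model_config` is simply the keyword arguments passed to `OpenAIModel` alongside `client_args`, with `params` forwarded to the chat completions request. A minimal sketch as plain data:

```python
# model_config keys as plain data: model_id selects the model,
# params are forwarded to the chat completions request.
model_config = {
    "model_id": "gpt-4o",
    "params": {
        "max_tokens": 1000,  # cap on generated tokens
        "temperature": 0.7,  # sampling temperature
    },
}
```

These would be unpacked as `OpenAIModel(client_args=..., **model_config)`, matching the `# **model_config` comment in the Usage example.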

## Troubleshooting

### Module Not Found

If you encounter the error `ModuleNotFoundError: No module named 'openai'`, this means you haven't installed the `openai` dependency in your environment. To fix, run `pip install 'strands-agents[openai]'`.
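A quick way to confirm the optional dependency is present before constructing the model (a small diagnostic sketch, not part of the SDK):

```python
import importlib.util

# Probe for the optional 'openai' package without importing it fully.
if importlib.util.find_spec("openai") is None:
    print("openai not found: run  pip install 'strands-agents[openai]'")
else:
    print("openai is installed")
```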

## References

- [API](../../../api-reference/models.md)
- [OpenAI](https://platform.openai.com/docs/overview)
1 change: 1 addition & 0 deletions docs/user-guide/quickstart.md
@@ -230,6 +230,7 @@ Strands Agents supports several other model providers beyond Amazon Bedrock:
 - **[LiteLLM](concepts/model-providers/litellm.md)** - Unified interface for OpenAI, Mistral, and other providers
 - **[Llama API](concepts/model-providers/llamaapi.md)** - Access to Meta's Llama models
 - **[Ollama](concepts/model-providers/ollama.md)** - Run models locally for privacy or offline use
+- **[OpenAI](concepts/model-providers/openai.md)** - Direct API access to OpenAI or OpenAI-compatible models
 - **[Custom Providers](concepts/model-providers/custom_model_provider.md)** - Build your own provider for specialized needs

 ## Capturing Streamed Data & Events
3 changes: 2 additions & 1 deletion mkdocs.yml
@@ -75,8 +75,9 @@ nav:
       - Amazon Bedrock: user-guide/concepts/model-providers/amazon-bedrock.md
       - Anthropic: user-guide/concepts/model-providers/anthropic.md
       - LiteLLM: user-guide/concepts/model-providers/litellm.md
-      - Ollama: user-guide/concepts/model-providers/ollama.md
       - LlamaAPI: user-guide/concepts/model-providers/llamaapi.md
+      - Ollama: user-guide/concepts/model-providers/ollama.md
+      - OpenAI: user-guide/concepts/model-providers/openai.md
       - Custom Providers: user-guide/concepts/model-providers/custom_model_provider.md
   - Streaming:
       - Async Iterators: user-guide/concepts/streaming/async-iterators.md