docs: add cover image and simplify quickstart documentation (#135)
# Enhanced README with Visual Elements and Improved Documentation Structure
This PR adds visual elements and restructures the documentation to make it more user-friendly:
- Added a cover image to the README for better visual identity
- Simplified the quickstart section to focus on the HTTP transport (30-second setup)
- Removed the overview section as its content was redundant with other sections
- Created a dedicated `core-package.md` documentation file with comprehensive guidance on using Bifrost as a Go package
- Reorganized the "Getting Started" section to clearly distinguish between the two main usage patterns:
  1. As a Go package (core integration)
  2. As an HTTP API (transport layer)
- Added feature highlights for SDK support and seamless integration with existing Generative AI SDKs
- Included a demo video for the package integration
- Improved formatting and navigation throughout the documentation
These changes make the documentation more accessible while maintaining all the essential information for users.
Bifrost is an open-source middleware that serves as a unified gateway to various AI model providers, enabling seamless integration and fallback mechanisms for your AI-powered applications.

## ⚡ Quickstart (30 seconds)
### Prerequisites
- Go 1.23 or higher (not needed if using Docker)
- Access to at least one AI model provider (OpenAI, Anthropic, etc.)
- API keys for the providers you wish to use

### Using Bifrost HTTP Transport

1. **Create `config.json`**: This file should contain your provider settings and API keys.

```bash
export OPENAI_API_KEY=your_openai_api_key
```
Note: Ensure you add all variables stated in your `config.json` file.

```bash
docker run -p 8080:8080 \
  -v $(pwd)/config.json:/app/config/config.json \
  -e OPENAI_API_KEY \
  maximhq/bifrost
```

```
  "provider": "openai",
  "model": "gpt-4o-mini",
  "messages": [
    {"role": "user", "content": "Tell me about Bifrost in Norse mythology."}
  ]
}'
```
For additional HTTP server configuration options, read [this](https://github.com/maximhq/bifrost/blob/main/transports/README.md).
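
Once the server is up, any HTTP client works. As a rough illustration, here is a request with the same body as the curl example above, issued from Go's standard library; the `/v1/chat/completions` path is assumed here, so confirm the exact route in the transports README linked above.

```golang
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Same body as the curl example above (provider, model, messages).
	body := []byte(`{
	  "provider": "openai",
	  "model": "gpt-4o-mini",
	  "messages": [
	    {"role": "user", "content": "Tell me about Bifrost in Norse mythology."}
	  ]
	}`)

	// NOTE: the route below is assumed; check the transports README for the actual path.
	resp, err := http.Post("http://localhost:8080/v1/chat/completions", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	fmt.Println(string(out))
}
```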

---

## ✨ Features
- **Multi-Provider Support**: Integrate with OpenAI, Anthropic, Amazon Bedrock, Mistral, Ollama, and more through a single API
- **MCP Integration**: Built-in Model Context Protocol (MCP) support for external tool integration and execution
- **Custom Configuration**: Offers granular control over pool sizes, network retry settings, fallback providers, and network proxy configurations
- **Built-in Observability**: Native Prometheus metrics out of the box, no wrappers, no sidecars, just drop it in and scrape
- **SDK Support**: Bifrost is available as a Go package, so you can use it directly in your own applications.
- **Seamless Integration with Generative AI SDKs**: Effortlessly transition to Bifrost by simply updating the `base_url` in your existing SDKs, such as OpenAI, Anthropic, GenAI, and more. One line of code is all it takes to make the switch (see the sketch below).
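
As a sketch of that one-line switch, here is what it can look like from Go using a community OpenAI SDK (`github.com/sashabaranov/go-openai`); the `/openai` base path is an assumption for Bifrost's OpenAI-compatible route, so verify it against the transports README.

```golang
package main

import (
	"context"
	"fmt"
	"os"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	cfg := openai.DefaultConfig(os.Getenv("OPENAI_API_KEY"))
	// Only change: point the SDK at Bifrost instead of api.openai.com.
	// The "/openai" path is assumed here — confirm the exact base URL in the transports README.
	cfg.BaseURL = "http://localhost:8080/openai"
	client := openai.NewClientWithConfig(cfg)

	resp, err := client.CreateChatCompletion(context.Background(), openai.ChatCompletionRequest{
		Model: "gpt-4o-mini",
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: "Tell me about Bifrost in Norse mythology."},
		},
	})
	if err != nil {
		panic(err)
	}
	fmt.Println(resp.Choices[0].Message.Content)
}
```

The rest of the application code stays unchanged.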

Project structure (excerpt):

    │   ├── bifrost-http/    # HTTP transport implementation
    │   └── ...
    │
    └── plugins/             # Plugin Implementations
## 🚀 Getting Started

There are two main ways to use Bifrost:
### 1. As a Go Package (Core Integration)

For direct integration into your Go applications, use Bifrost as a package. This provides the most flexibility and control over your AI model interactions.
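
A typical first step is fetching the core module; the module path below is inferred from the repository layout, so double-check it against the dedicated `core-package.md` guide, which covers the account interface and initialization in detail.

```bash
# Module path inferred from the repository layout — verify in core-package.md
go get github.com/maximhq/bifrost/core
```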

### 2. As an HTTP API (Transport Layer)

For quick setup and language-agnostic integration, use the HTTP transport layer.