Commit 99670c2

docs: add cover image and simplify quickstart documentation (#135)
# Enhanced README with Visual Elements and Improved Documentation Structure

This PR adds visual elements and restructures the documentation to make it more user-friendly:

- Added a cover image to the README for better visual identity
- Simplified the quickstart section to focus on the HTTP transport (30-second setup)
- Removed the overview section, as its content was redundant with other sections
- Created a dedicated `core-package.md` documentation file with comprehensive guidance on using Bifrost as a Go package
- Reorganized the "Getting Started" section to clearly distinguish between the two main usage patterns:
  1. As a Go package (core integration)
  2. As an HTTP API (transport layer)
- Added feature entries highlighting SDK support and seamless integration with existing Generative AI SDKs
- Included a demo video for the package integration
- Improved formatting and navigation throughout the documentation

These changes make the documentation more accessible while maintaining all the essential information for users.
2 parents c559c93 + ad33158 commit 99670c2

File tree

4 files changed: +240 -114 lines


README.md

Lines changed: 32 additions & 114 deletions
````diff
@@ -4,15 +4,17 @@
 
 Bifrost is an open-source middleware that serves as a unified gateway to various AI model providers, enabling seamless integration and fallback mechanisms for your AI-powered applications.
 
-## ⚡ Quickstart
+![Bifrost](./docs/media/cover.png)
+
+## ⚡ Quickstart (30 seconds)
 
 ### Prerequisites
 
 - Go 1.23 or higher (not needed if using Docker)
 - Access to at least one AI model provider (OpenAI, Anthropic, etc.)
 - API keys for the providers you wish to use
 
-### A. Using Bifrost as an HTTP Server
+### Using Bifrost HTTP Transport
 
 1. **Create `config.json`**: This file should contain your provider settings and API keys.
 
````
````diff
@@ -36,7 +38,6 @@ Bifrost is an open-source middleware that serves as a unified gateway to various
 
 ```bash
 export OPENAI_API_KEY=your_openai_api_key
-export ANTHROPIC_API_KEY=your_anthropic_api_key
 ```
 
 Note: Ensure you add all variables stated in your `config.json` file.
````
````diff
@@ -73,7 +74,6 @@ Bifrost is an open-source middleware that serves as a unified gateway to various
 docker run -p 8080:8080 \
   -v $(pwd)/config.json:/app/config/config.json \
   -e OPENAI_API_KEY \
-  -e ANTHROPIC_API_KEY \
   maximhq/bifrost
 ```
 
````
````diff
@@ -88,93 +88,29 @@ Bifrost is an open-source middleware that serves as a unified gateway to various
     "provider": "openai",
     "model": "gpt-4o-mini",
     "messages": [
-      {"role": "system", "content": "You are a helpful assistant."},
       {"role": "user", "content": "Tell me about Bifrost in Norse mythology."}
     ]
   }'
 ```
 
-For additional HTTP server configuration options, read [this](https://github.com/maximhq/bifrost/blob/main/transports/README.md).
-
-### B. Using Bifrost as a Go Package
-
-1. **Implement Your Account Interface**: First, create an account that follows [Bifrost's account interface](https://github.com/maximhq/bifrost/blob/main/core/schemas/account.go).
-
-```golang
-type BaseAccount struct{}
-
-func (baseAccount *BaseAccount) GetConfiguredProviders() ([]schemas.ModelProvider, error) {
-    return []schemas.ModelProvider{schemas.OpenAI}, nil
-}
-
-func (baseAccount *BaseAccount) GetKeysForProvider(providerKey schemas.ModelProvider) ([]schemas.Key, error) {
-    return []schemas.Key{
-        {
-            Value:  os.Getenv("OPENAI_API_KEY"),
-            Models: []string{"gpt-4o-mini"},
-            Weight: 1.0,
-        },
-    }, nil
-}
-
-func (baseAccount *BaseAccount) GetConfigForProvider(providerKey schemas.ModelProvider) (*schemas.ProviderConfig, error) {
-    return &schemas.ProviderConfig{
-        NetworkConfig:            schemas.DefaultNetworkConfig,
-        ConcurrencyAndBufferSize: schemas.DefaultConcurrencyAndBufferSize,
-    }, nil
-}
-```
-
-Bifrost uses these methods to get all the keys and configurations it needs to call the providers. See the [Additional Configurations](#additional-configurations) section for additional customization options.
-
-2. **Initialize Bifrost**: Set up the Bifrost instance by providing your account implementation.
-
-```golang
-account := BaseAccount{}
-
-client, err := bifrost.Init(schemas.BifrostConfig{
-    Account: &account,
-})
-```
-
-3. **Use Bifrost**: Make your first LLM call!
-
-```golang
-bifrostResult, bifrostErr := bifrost.ChatCompletionRequest(
-    context.Background(),
-    &schemas.BifrostRequest{
-        Provider: schemas.OpenAI,
-        Model:    "gpt-4o-mini", // make sure you have configured gpt-4o-mini in your account interface
-        Input: schemas.RequestInput{
-            ChatCompletionInput: bifrost.Ptr([]schemas.BifrostMessage{{
-                Role: schemas.ModelChatMessageRoleUser,
-                Content: schemas.MessageContent{
-                    ContentStr: bifrost.Ptr("What is a LLM gateway?"),
-                },
-            }}),
-        },
-    },
-)
-```
+**That's it!** Just _4 lines of code_ and you can now use Bifrost to make requests to any provider you have configured.
 
-You can add model parameters by including `Params: &schemas.ModelParameters{...yourParams}` in ChatCompletionRequest.
+> For additional HTTP server configuration options, read [this](https://github.com/maximhq/bifrost/blob/main/transports/README.md).
 
 ## 📑 Table of Contents
 
 - [Bifrost](#bifrost)
-  - [⚡ Quickstart](#-quickstart)
+  - [⚡ Quickstart (30 seconds)](#-quickstart-30-seconds)
     - [Prerequisites](#prerequisites)
-    - [A. Using Bifrost as an HTTP Server](#a-using-bifrost-as-an-http-server)
+    - [Using Bifrost HTTP Transport](#using-bifrost-http-transport)
       - [i) Using Go Binary](#i-using-go-binary)
       - [ii) OR Using Docker](#ii-or-using-docker)
-    - [B. Using Bifrost as a Go Package](#b-using-bifrost-as-a-go-package)
   - [📑 Table of Contents](#-table-of-contents)
-  - [🔍 Overview](#-overview)
   - [✨ Features](#-features)
   - [🏗️ Repository Structure](#️-repository-structure)
   - [🚀 Getting Started](#-getting-started)
-    - [Package Structure](#package-structure)
-    - [Additional Configurations](#additional-configurations)
+    - [1. As a Go Package (Core Integration)](#1-as-a-go-package-core-integration)
+    - [2. As an HTTP API (Transport Layer)](#2-as-an-http-api-transport-layer)
   - [📊 Benchmarks](#-benchmarks)
     - [Test Environment](#test-environment)
     - [1. t3.medium(2 vCPUs, 4GB RAM)](#1-t3medium2-vcpus-4gb-ram)
````
````diff
@@ -186,20 +122,6 @@ For additional HTTP server configuration options, read [this](https://github.com
 
 ---
 
-## 🔍 Overview
-
-Bifrost acts as a bridge between your applications and multiple AI providers (OpenAI, Anthropic, Amazon Bedrock, Mistral, Ollama, etc.). It provides a consistent API while handling:
-
-- Authentication and key management
-- Request routing and load balancing
-- Fallback mechanisms for reliability
-- Unified request and response formatting
-- Connection pooling and concurrency control
-
-With Bifrost, you can focus on building your AI-powered applications without worrying about the underlying provider-specific implementations. It handles all the complexities of key and provider management, providing a fixed input and output format so you don't need to modify your codebase for different providers.
-
----
-
 ## ✨ Features
 
 - **Multi-Provider Support**: Integrate with OpenAI, Anthropic, Amazon Bedrock, Mistral, Ollama, and more through a single API
````
````diff
@@ -212,6 +134,8 @@ With Bifrost, you can focus on building your AI-powered applications without wor
 - **MCP Integration**: Built-in Model Context Protocol (MCP) support for external tool integration and execution
 - **Custom Configuration**: Offers granular control over pool sizes, network retry settings, fallback providers, and network proxy configurations
 - **Built-in Observability**: Native Prometheus metrics out of the box, no wrappers, no sidecars, just drop it in and scrape
+- **SDK Support**: Bifrost is available as a Go package, so you can use it directly in your own applications.
+- **Seamless Integration with Generative AI SDKs**: Effortlessly transition to Bifrost by simply updating the `base_url` in your existing SDKs, such as OpenAI, Anthropic, GenAI, and more. Just one line of code is all it takes to make the switch.
 
 ---
````
````diff
@@ -233,7 +157,7 @@ bifrost/
 │   └── ...
 
 ├── transports/        # Interface layers (HTTP, gRPC, etc.)
-│   ├── bifrost-http/  # HTTP transport implementation
+│   ├── bifrost-http/  # HTTP transport implementation
 │   └── ...
 
 └── plugins/           # Plugin Implementations
````
````diff
@@ -247,40 +171,34 @@ The system uses a provider-agnostic approach with well-defined interfaces to eas
 
 ## 🚀 Getting Started
 
-If you want to **set up the Bifrost API quickly**, [check the transports documentation](https://github.com/maximhq/bifrost/tree/main/transports/README.md).
-
-### Package Structure
+There are two main ways to use Bifrost:
 
-Bifrost is divided into three Go packages: core, plugins, and transports.
+### 1. As a Go Package (Core Integration)
 
-1. **core**: This package contains the core implementation of Bifrost as a Go package.
-2. **plugins**: This package serves as an extension to core. You can download individual packages using `go get github.com/maximhq/bifrost/plugins/{plugin-name}` and pass the plugins while initializing Bifrost.
+For direct integration into your Go applications, use Bifrost as a package. This provides the most flexibility and control over your AI model interactions.
 
-```golang
-// go get github.com/maximhq/bifrost/plugins/maxim
+> **📖 [Complete Core Package Documentation](./docs/core-package.md)**
 
-maximPlugin, err := maxim.NewMaximLoggerPlugin(os.Getenv("MAXIM_API_KEY"), os.Getenv("MAXIM_LOGGER_ID"))
-if err != nil {
-    return nil, err
-}
+Quick example:
 
-// Initialize Bifrost
-client, err := bifrost.Init(schemas.BifrostConfig{
-    Account: &account,
-    Plugins: []schemas.Plugin{maximPlugin},
-})
+```bash
+go get github.com/maximhq/bifrost/core
 ```
 
-3. **transports**: This package contains transport clients like HTTP to expose your Bifrost client. You can either `go get` this package or directly use the independent Dockerfile to quickly spin up your [Bifrost API](https://github.com/maximhq/bifrost/tree/main/transports/README.md) (read more on this).
+### 2. As an HTTP API (Transport Layer)
 
-### Additional Configurations
+For quick setup and language-agnostic integration, use the HTTP transport layer.
 
-- [Memory Management](https://github.com/maximhq/bifrost/blob/main/docs/memory-management.md)
-- [Logger](https://github.com/maximhq/bifrost/blob/main/docs/logger.md)
-- [Plugins](https://github.com/maximhq/bifrost/blob/main/docs/plugins.md)
-- [Provider Configurations](https://github.com/maximhq/bifrost/blob/main/docs/providers.md)
-- [Fallbacks](https://github.com/maximhq/bifrost/blob/main/docs/fallbacks.md)
-- [MCP Integration](https://github.com/maximhq/bifrost/blob/main/docs/mcp.md)
+> **📖 [Complete HTTP Transport Documentation](./transports/README.md)**
+
+Quick example:
+
+```bash
+docker run -p 8080:8080 \
+  -v $(pwd)/config.json:/app/config/config.json \
+  -e OPENAI_API_KEY \
+  maximhq/bifrost
+```
 
 ---
````
