Bifrost is a high-performance AI gateway that connects you to 8+ providers (OpenAI, Anthropic, Amazon Bedrock, Mistral, Ollama, and more) through a single API.
📖 For detailed setup guides with multiple providers, advanced configuration, and language examples, see [Quick Start Documentation](./docs/quickstart/README.md)

**Step 1:** Start Bifrost (choose one)

```bash
# 🐳 Docker (easiest - zero config needed!)
docker pull maximhq/bifrost
docker run -p 8080:8080 maximhq/bifrost

# 🔧 Or install Go binary (make sure Go is in your PATH)
go install github.com/maximhq/bifrost/transports/bifrost-http@latest
bifrost-http -port 8080
```

**Step 2:** Open the built-in web interface

```bash
# 🖥️ Configure visually - no config files needed!
# macOS:
open http://localhost:8080

# Linux:
xdg-open http://localhost:8080

# Windows:
start http://localhost:8080

# Or simply open http://localhost:8080 manually in your browser
```

**Step 3:** Add your provider via the web UI or API

```bash
# Via Web UI: just click "Add Provider" and enter your OpenAI API key.
# Make sure the environment variable OPENAI_API_KEY is set in Bifrost's session,
# or pass it as a flag in Docker (docker run -e OPENAI_API_KEY maximhq/bifrost).
```

**Step 4:** Test it works

```bash
curl -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Hello!"}]}'
```

**🎉 Boom! You're done!**

Your AI gateway is now running with a beautiful web interface. You can:

- **🖥️ Configure everything visually** - No more JSON files!
- **📊 Monitor requests in real-time** - See logs, analytics, and metrics
- **🔄 Add providers and MCP clients on-the-fly** - Scale and failover without restarts
- **🚀 Drop into existing code** - Zero changes to your OpenAI/Anthropic apps
94
90
95
> **Want more?** See our [Complete Setup Guide](./docs/quickstart/http-transport.md) for multi-provider configuration, failover strategies, and production deployment.
## ✨ Features

- **🖥️ Built-in Web UI**: Visual configuration, real-time monitoring, and analytics dashboard - no config files needed
- **🚀 Zero-Config Startup & Easy Integration**: Start immediately with dynamic provider configuration, or integrate existing SDKs by simply updating the `base_url` - one line of code to get running
- **🔄 Multi-Provider Support**: Integrate with OpenAI, Anthropic, Amazon Bedrock, Mistral, Ollama, and more through a single API
- **🛡️ Fallback Mechanisms**: Automatically retry failed requests with alternative models or providers
- **🔑 Dynamic Key Management**: Rotate and manage API keys efficiently with weighted distribution
- **⚡ Connection Pooling**: Optimize network resources for better performance
- **🎯 Concurrency Control**: Manage rate limits and parallel requests effectively
- **🔌 Flexible Transports**: Multiple transports for easy integration into your infra
- **🏗️ Plugin First Architecture**: No callback hell, simple addition/creation of custom plugins
- **🛠️ MCP Integration**: Built-in Model Context Protocol (MCP) support for external tool integration and execution
- **⚙️ Custom Configuration**: Offers granular control over pool sizes, network retry settings, fallback providers, and network proxy configurations
- **📊 Built-in Observability**: Native Prometheus metrics out of the box, no wrappers, no sidecars, just drop it in and scrape
- **🔧 SDK Support**: Bifrost is available as a Go package, so you can use it directly in your own applications