Commit 5ad3df8

feat: transport flags simplified and logs updated

Parent: 8991e0d

File tree

16 files changed (+614, -309 lines changed)

README.md

Lines changed: 50 additions & 44 deletions
@@ -26,43 +26,47 @@ Bifrost is a high-performance AI gateway that connects you to 8+ providers (Open
 
 📖 For detailed setup guides with multiple providers, advanced configuration, and language examples, see [Quick Start Documentation](./docs/quickstart/README.md)
 
-**Step 1:** Create your config (copy & paste this)
-
-```json
-{
-  "providers": {
-    "openai": {
-      "keys": [
-        {
-          "value": "env.OPENAI_API_KEY",
-          "models": ["gpt-4o-mini"],
-          "weight": 1.0
-        }
-      ]
-    }
-  }
-}
+**Step 1:** Start Bifrost (choose one)
+
+```bash
+# 🐳 Docker (easiest - zero config needed!)
+docker pull maximhq/bifrost
+docker run -p 8080:8080 maximhq/bifrost
+
+# 🔧 Or install the Go binary (make sure Go is in your PATH)
+go install github.com/maximhq/bifrost/transports/bifrost-http@latest
+bifrost-http -port 8080
 ```
 
-**Step 2:** Add your API key
+**Step 2:** Open the built-in web interface
 
 ```bash
-export OPENAI_API_KEY=your_openai_api_key
+# 🖥️ Configure visually - no config files needed!
+# macOS:
+open http://localhost:8080
+
+# Linux:
+xdg-open http://localhost:8080
+
+# Windows:
+start http://localhost:8080
+
+# Or simply open http://localhost:8080 manually in your browser
 ```
 
-**Step 3:** Start Bifrost (choose one)
+**Step 3:** Add your provider via the web UI or API
 
 ```bash
-# 🐳 Docker
-docker pull maximhq/bifrost
-docker run -p 8080:8080 \
-  -v $(pwd)/config.json:/app/config/config.json \
-  -e OPENAI_API_KEY \
-  maximhq/bifrost
+# Via the web UI: click "Add Provider" and enter your OpenAI API key
+# Or via the API:
+curl -X POST http://localhost:8080/providers \
+  -H "Content-Type: application/json" \
+  -d '{
+    "provider": "openai",
+    "keys": [{"value": "env.OPENAI_API_KEY", "models": ["gpt-4o-mini"], "weight": 1.0}]
+  }'
 
-# 🔧 Or install Go binary (Make sure Go is in your PATH)
-go install github.com/maximhq/bifrost/transports/bifrost-http@latest
-bifrost-http -config config.json -port 8080
+# Make sure OPENAI_API_KEY is set in Bifrost's environment, or pass it as a flag to Docker (docker run -e OPENAI_API_KEY maximhq/bifrost).
 ```
 
 **Step 4:** Test it works

@@ -81,11 +85,12 @@ curl -X POST http://localhost:8080/v1/chat/completions \
 
 **🎉 Boom! You're done!**
 
-Your AI gateway is now running and ready for production. You can:
+Your AI gateway is now running with a beautiful web interface. You can:
 
-- Add more providers for automatic failover
-- Scale to thousands of requests per second
-- Drop this into existing OpenAI/Anthropic code with zero changes
+- **🖥️ Configure everything visually** - no more JSON files!
+- **📊 Monitor requests in real-time** - see logs, analytics, and metrics
+- **🔄 Add providers and MCP clients on-the-fly** - scale and fail over without restarts
+- **🚀 Drop into existing code** - zero changes to your OpenAI/Anthropic apps
 
 > **Want more?** See our [Complete Setup Guide](./docs/quickstart/http-transport.md) for multi-provider configuration, failover strategies, and production deployment.

@@ -112,18 +117,19 @@ Your AI gateway is now running and ready for production. You can:
 
 ## ✨ Features
 
-- **Multi-Provider Support**: Integrate with OpenAI, Anthropic, Amazon Bedrock, Mistral, Ollama, and more through a single API
-- **Fallback Mechanisms**: Automatically retry failed requests with alternative models or providers
-- **Dynamic Key Management**: Rotate and manage API keys efficiently with weighted distribution
-- **Connection Pooling**: Optimize network resources for better performance
-- **Concurrency Control**: Manage rate limits and parallel requests effectively
-- **Flexible Transports**: Multiple transports for easy integration into your infra
-- **Plugin First Architecture**: No callback hell, simple addition/creation of custom plugins
-- **MCP Integration**: Built-in Model Context Protocol (MCP) support for external tool integration and execution
-- **Custom Configuration**: Offers granular control over pool sizes, network retry settings, fallback providers, and network proxy configurations
-- **Built-in Observability**: Native Prometheus metrics out of the box, no wrappers, no sidecars, just drop it in and scrape
-- **SDK Support**: Bifrost is available as a Go package, so you can use it directly in your own applications.
-- **Seamless Integration with Generative AI SDKs**: Effortlessly transition to Bifrost by simply updating the `base_url` in your existing SDKs, such as OpenAI, Anthropic, GenAI, and more. Just one line of code is all it takes to make the switch.
+- **🖥️ Built-in Web UI**: Visual configuration, real-time monitoring, and an analytics dashboard - no config files needed
+- **🚀 Zero-Config Startup & Easy Integration**: Start immediately with dynamic provider configuration, or integrate existing SDKs by simply updating the `base_url` - one line of code to get running
+- **🔄 Multi-Provider Support**: Integrate with OpenAI, Anthropic, Amazon Bedrock, Mistral, Ollama, and more through a single API
+- **🛡️ Fallback Mechanisms**: Automatically retry failed requests with alternative models or providers
+- **🔑 Dynamic Key Management**: Rotate and manage API keys efficiently with weighted distribution
+- **⚡ Connection Pooling**: Optimize network resources for better performance
+- **🎯 Concurrency Control**: Manage rate limits and parallel requests effectively
+- **🔌 Flexible Transports**: Multiple transports for easy integration into your infra
+- **🏗️ Plugin First Architecture**: No callback hell, simple addition/creation of custom plugins
+- **🛠️ MCP Integration**: Built-in Model Context Protocol (MCP) support for external tool integration and execution
+- **⚙️ Custom Configuration**: Offers granular control over pool sizes, network retry settings, fallback providers, and network proxy configurations
+- **📊 Built-in Observability**: Native Prometheus metrics out of the box, no wrappers, no sidecars, just drop it in and scrape
+- **🔧 SDK Support**: Bifrost is available as a Go package, so you can use it directly in your own applications
 
 ---
 
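For reference, the provider-registration payload that the new Step 3 posts to `POST /providers` has the same shape as the old `config.json` `providers` entry. A minimal offline sketch (assuming `python3` is available for validation) that builds and sanity-checks the payload before the curl command from the README sends it:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Same payload the README's Step 3 posts to POST /providers.
# "env.OPENAI_API_KEY" tells Bifrost to read the key from its own environment.
PAYLOAD='{
  "provider": "openai",
  "keys": [{"value": "env.OPENAI_API_KEY", "models": ["gpt-4o-mini"], "weight": 1.0}]
}'

# Sanity-check the JSON locally before sending it to the gateway.
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload OK"

# With Bifrost running (Step 1) and OPENAI_API_KEY exported in its environment:
# curl -X POST http://localhost:8080/providers \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
```

The curl call itself is commented out because it needs a running gateway; the validation step works anywhere.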

docs/README.md

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@ Choose your preferred way to use Bifrost:
 | **🔧 Go Package** | Direct integration, maximum control | 2 minutes | [📖 Go Package Guide](quickstart/go-package.md) |
 | **🌐 HTTP Transport** | Language-agnostic, microservices | 30 seconds | [📖 HTTP Transport Guide](quickstart/http-transport.md) |
 
-**New to Bifrost?** Start with [⚡ Quick Start](quickstart/) to get running in under 30 seconds.
+**New to Bifrost?** Start with [⚡ Quick Start](quickstart/) to get running with zero configuration in under 30 seconds.
 
 ---
 

docs/contributing/README.md

Lines changed: 6 additions & 3 deletions
@@ -13,14 +13,17 @@ Ready to contribute? Here's your fastest path to making an impact:
 ```bash
 # 1. Fork and clone
 git clone https://github.com/YOUR_USERNAME/bifrost.git
-cd bifrost
+cd bifrost/core  # or bifrost/transports
 
 # 2. Install dependencies
 go mod download
 
 # 3. Verify setup
-go test ./core/...
-cd transports && go build -o bifrost-http
+cd ../tests/core-providers/
+go test -run TestOpenAI  # or any provider you want to test
+
+cd ../transports-integrations/
+# See the README.md in the transports-integrations directory for testing instructions
 
 # 4. You're ready! 🎉
 ```

docs/media/cover.png

-22 Bytes

docs/quickstart/README.md

Lines changed: 2 additions & 0 deletions
@@ -25,11 +25,13 @@ Get up and running with Bifrost in under 30 seconds. Choose your preferred integ
 
 ## 🌐 **HTTP Transport** - Choose if you:
 
+- ✅ Want a clean UI for configuration and monitoring
 - ✅ Use any programming language (Python, Node.js, etc.)
 - ✅ Want to keep AI logic separate from your application
 - ✅ Need a centralized AI gateway for multiple services
 - ✅ Prefer REST API integration patterns
 - ✅ Want drop-in compatibility with existing provider SDKs
+- ✅ Want to **add providers & MCP clients on-the-fly** without restarts
 
 **[Start with HTTP Transport](http-transport.md)**
 
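The "drop-in compatibility" bullet above comes down to a single URL change: an existing OpenAI-compatible client keeps its request body and only swaps the host for the Bifrost gateway. A small offline sketch (model name and endpoint path as in the README's quickstart; the commented curl assumes a running gateway and `python3` is assumed for validation):

```shell
#!/usr/bin/env bash
set -euo pipefail

# The only change an existing OpenAI-compatible client needs:
#   https://api.openai.com/v1/chat/completions  ->  http://localhost:8080/v1/chat/completions
BASE_URL="http://localhost:8080"

# Standard OpenAI-compatible chat body - unchanged when going through Bifrost.
BODY='{
  "model": "gpt-4o-mini",
  "messages": [{"role": "user", "content": "Hello from Bifrost!"}]
}'

# Validate the body locally; with the gateway running, the commented curl sends it.
echo "$BODY" | python3 -m json.tool > /dev/null && echo "request OK"

# curl -X POST "$BASE_URL/v1/chat/completions" \
#   -H "Content-Type: application/json" \
#   -d "$BODY"
```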
