
Commit a0b9f1d

feat: added mcp to http transport (#100)
# Enhanced Configuration Structure and MCP Integration

This PR introduces a more organized configuration structure and adds comprehensive Model Context Protocol (MCP) support for tool integration:

1. Restructured configuration format with a nested `providers` object for better organization
2. Added full MCP (Model Context Protocol) integration for external tool execution
3. Implemented a dedicated `/v1/mcp/tool/execute` endpoint for tool calls
4. Added detailed documentation for multi-turn conversations with MCP tools
5. Updated OpenAPI specification to include MCP tool execution endpoints
6. Improved logging for MCP client connections
7. Added weighted key distribution support in configuration examples
8. Updated `.gitignore` to exclude private files
9. Enhanced documentation with text formatting and code examples

The new configuration structure provides a cleaner separation between provider settings and MCP configuration, while the MCP integration enables AI models to discover and use external tools through a standardized protocol.
1 parent 6820c37 commit a0b9f1d

File tree

14 files changed (+971, -248 lines)


.gitignore

Lines changed: 1 addition & 2 deletions
````diff
@@ -3,6 +3,5 @@
 .DS_Store
 *_creds*
 **/venv/
-**/__pycache__/
 **/__pycache__/**
-private.*
+private.*
````

README.md

Lines changed: 13 additions & 10 deletions
````diff
@@ -18,14 +18,16 @@ Bifrost is an open-source middleware that serves as a unified gateway to various
 
 ```json
 {
-  "openai": {
-    "keys": [
-      {
-        "value": "env.OPENAI_API_KEY",
-        "models": ["gpt-4o-mini"],
-        "weight": 1.0
-      }
-    ]
+  "providers": {
+    "openai": {
+      "keys": [
+        {
+          "value": "env.OPENAI_API_KEY",
+          "models": ["gpt-4o-mini"],
+          "weight": 1.0
+        }
+      ]
+    }
   }
 }
 ```
@@ -110,6 +112,7 @@ For additional HTTP server configuration options, read [this](https://github.com
 {
   Value: os.Getenv("OPENAI_API_KEY"),
   Models: []string{"gpt-4o-mini"},
+  Weight: 1.0,
 },
 }, nil
 }
@@ -201,12 +204,12 @@ With Bifrost, you can focus on building your AI-powered applications without wor
 
 - **Multi-Provider Support**: Integrate with OpenAI, Anthropic, Amazon Bedrock, Mistral, Ollama, and more through a single API
 - **Fallback Mechanisms**: Automatically retry failed requests with alternative models or providers
-- **Dynamic Key Management**: Rotate and manage API keys efficiently
+- **Dynamic Key Management**: Rotate and manage API keys efficiently with weighted distribution
 - **Connection Pooling**: Optimize network resources for better performance
 - **Concurrency Control**: Manage rate limits and parallel requests effectively
 - **Flexible Transports**: Multiple transports for easy integration into your infra
 - **Plugin First Architecture**: No callback hell, simple addition/creation of custom plugins
-- **MCP Integration**: Built-in Model Context Protocol (MCP) support for external tool integration
+- **MCP Integration**: Built-in Model Context Protocol (MCP) support for external tool integration and execution
 - **Custom Configuration**: Offers granular control over pool sizes, network retry settings, fallback providers, and network proxy configurations
 - **Built-in Observability**: Native Prometheus metrics out of the box, no wrappers, no sidecars, just drop it in and scrape
 
````

core/mcp.go

Lines changed: 2 additions & 0 deletions
````diff
@@ -486,6 +486,8 @@ func (m *MCPManager) connectToMCPClient(config schemas.MCPClientConfig) error {
 		for toolName, tool := range tools {
 			client.ToolMap[toolName] = tool
 		}
+
+		m.logger.Info(fmt.Sprintf("%s Connected to MCP client: %s", MCPLogPrefix, config.Name))
 	} else {
 		return fmt.Errorf("client %s was removed during connection setup", config.Name)
 	}
````

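The added log line runs once per client, after all discovered tools have been copied into `client.ToolMap`. A simplified sketch of that registration pattern (the `Tool` and `Client` types below are stand-ins for Bifrost's schema types, not its real API):

```go
package main

import (
	"fmt"
)

// Tool and Client are simplified stand-ins for Bifrost's MCP types.
type Tool struct{ Description string }

type Client struct {
	Name    string
	ToolMap map[string]Tool
}

// registerTools mirrors the connectToMCPClient pattern: copy each
// discovered tool into the client's map, then log once afterwards.
func registerTools(c *Client, tools map[string]Tool) {
	for name, tool := range tools {
		c.ToolMap[name] = tool
	}
	fmt.Printf("[MCP] Connected to MCP client: %s (%d tools)\n", c.Name, len(c.ToolMap))
}

func main() {
	c := &Client{Name: "google-search", ToolMap: map[string]Tool{}}
	registerTools(c, map[string]Tool{"google_search": {Description: "web search"}})
}
```

Logging after the loop rather than inside it keeps the output to one line per connected client regardless of how many tools it exposes.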
docs/http-transport-api.md

Lines changed: 85 additions & 4 deletions
````diff
@@ -4,9 +4,11 @@ This document provides comprehensive API documentation for the Bifrost HTTP tran
 
 ## Base URL
 
-```
-http://localhost:8080
-```
+```text
+http://localhost:8080
+```
+
+> 🔧 **MCP (Model Context Protocol) Integration**: Bifrost HTTP transport includes built-in MCP support for external tool integration. When MCP is configured, tools are automatically discovered and added to model requests. For comprehensive MCP setup and usage, see the [**MCP Integration Guide**](mcp.md) and [**HTTP Transport MCP Configuration**](../transports/README.md#mcp-model-context-protocol-configuration).
 
 ## OpenAPI Specification
 
@@ -215,7 +217,86 @@ Creates a text completion from a prompt.
 }
 ```
 
-### 3. Metrics
+### 3. MCP Tool Execution
+
+**POST** `/v1/mcp/tool/execute`
+
+Executes MCP (Model Context Protocol) tools that have been configured in Bifrost. This endpoint is used to execute tool calls returned by AI models during conversations.
+
+> **Note**: This endpoint requires MCP to be configured in Bifrost. See [MCP Integration Guide](mcp.md) for setup instructions.
+
+#### Request Body
+
+```json
+{
+  "type": "function",
+  "id": "toolu_01Vmq4gaU6tSy7ZRKVC7U2fg",
+  "function": {
+    "name": "google_search",
+    "arguments": "{\"gl\":\"us\",\"hl\":\"en\",\"num\":5,\"q\":\"San Francisco news yesterday\",\"tbs\":\"qdr:d\"}"
+  }
+}
+```
+
+#### Response
+
+```json
+{
+  "role": "tool",
+  "content": "{\n \"searchParameters\": {\n \"q\": \"San Francisco news yesterday\",\n \"gl\": \"us\",\n \"hl\": \"en\",\n \"type\": \"search\",\n \"num\": 5,\n \"tbs\": \"qdr:d\",\n \"engine\": \"google\"\n },\n \"organic\": [\n {\n \"title\": \"San Francisco Chronicle · Giants' today\"\n },\n {\n \"query\": \"s.f. chronicle e edition\"\n }\n ],\n \"credits\": 1\n}",
+  "tool_call_id": "toolu_01Vmq4gaU6tSy7ZRKVC7U2fg"
+}
+```
+
+#### Multi-Turn Tool Workflow
+
+The typical workflow for using MCP tools involves:
+
+1. **Send chat completion request** → AI responds with `tool_calls`
+2. **Execute tools via `/v1/mcp/tool/execute`** → Get tool result messages
+3. **Add tool results to conversation** → Send back for final response
+
+```bash
+# Step 1: Chat completion (AI decides to use tools)
+curl -X POST http://localhost:8080/v1/chat/completions \
+  -H "Content-Type: application/json" \
+  -d '{
+    "provider": "openai",
+    "model": "gpt-4o-mini",
+    "messages": [
+      {"role": "user", "content": "Search for San Francisco news from yesterday"}
+    ]
+  }'
+
+# Step 2: Execute the tool call returned by AI
+curl -X POST http://localhost:8080/v1/mcp/tool/execute \
+  -H "Content-Type: application/json" \
+  -d '{
+    "type": "function",
+    "id": "toolu_01Vmq4gaU6tSy7ZRKVC7U2fg",
+    "function": {
+      "name": "google_search",
+      "arguments": "{\"q\":\"San Francisco news yesterday\"}"
+    }
+  }'
+
+# Step 3: Continue conversation with tool results
+curl -X POST http://localhost:8080/v1/chat/completions \
+  -H "Content-Type: application/json" \
+  -d '{
+    "provider": "openai",
+    "model": "gpt-4o-mini",
+    "messages": [
+      {"role": "user", "content": "Search for San Francisco news from yesterday"},
+      {"role": "assistant", "tool_calls": [...]},
+      {"role": "tool", "content": "...", "tool_call_id": "toolu_01Vmq4gaU6tSy7ZRKVC7U2fg"}
+    ]
+  }'
+```
+
+For detailed MCP setup and multi-turn conversation examples, see [Multi-Turn Conversations with MCP Tools](../transports/README.md#multi-turn-conversations-with-mcp-tools).
+
+### 4. Metrics
 
 **GET** `/metrics`
````

docs/logger.md

Lines changed: 6 additions & 6 deletions
````diff
@@ -37,16 +37,16 @@ client, err := bifrost.Init(schemas.BifrostConfig{
 
 The default logger formats messages as:
 
-```
-[BIFROST-TIMESTAMP] LEVEL: message
-[BIFROST-TIMESTAMP] ERROR: (error: error_message)
+```text
+[BIFROST-TIMESTAMP] LEVEL: message
+[BIFROST-TIMESTAMP] ERROR: (error: error_message)
 ```
 
 Example outputs:
 
-```
-[BIFROST-2024-03-20T10:15:30Z] INFO: Initializing provider OpenAI
-[BIFROST-2024-03-20T10:15:31Z] ERROR: (error: failed to connect to provider)
+```text
+[BIFROST-2024-03-20T10:15:30Z] INFO: Initializing provider OpenAI
+[BIFROST-2024-03-20T10:15:31Z] ERROR: (error: failed to connect to provider)
 ```
 
 ## 3. Implementing a Custom Logger
````

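The `[BIFROST-TIMESTAMP] LEVEL: message` format shown above is straightforward to reproduce in a custom logger. A minimal sketch, assuming plain `Info`/`Error` methods (Bifrost's actual logger interface may differ):

```go
package main

import (
	"fmt"
	"time"
)

// ConsoleLogger is an illustrative logger that emits the documented
// "[BIFROST-TIMESTAMP] LEVEL: message" format.
type ConsoleLogger struct{}

func (l ConsoleLogger) format(level, msg string) string {
	ts := time.Now().UTC().Format(time.RFC3339)
	return fmt.Sprintf("[BIFROST-%s] %s: %s", ts, level, msg)
}

func (l ConsoleLogger) Info(msg string) {
	fmt.Println(l.format("INFO", msg))
}

func (l ConsoleLogger) Error(err error) {
	fmt.Println(l.format("ERROR", fmt.Sprintf("(error: %v)", err)))
}

func main() {
	var logger ConsoleLogger
	logger.Info("Initializing provider OpenAI")
	logger.Error(fmt.Errorf("failed to connect to provider"))
}
```

Using RFC 3339 timestamps matches the `2024-03-20T10:15:30Z` style of the documented example output.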