Commit 7ab7659

feat: mcp added to http transport
1 parent 4161031 commit 7ab7659

File tree

16 files changed

+1103 -261 lines changed


.gitignore

Lines changed: 2 additions & 2 deletions

```diff
@@ -3,5 +3,5 @@
 .DS_Store
 *_creds*
 **/venv/
-**/__pycache__/
-**/__pycache__/**
+**/__pycache__/**
+private.*
```

README.md

Lines changed: 13 additions & 10 deletions

````diff
@@ -18,14 +18,16 @@ Bifrost is an open-source middleware that serves as a unified gateway to various
 
 ```json
 {
-  "openai": {
-    "keys": [
-      {
-        "value": "env.OPENAI_API_KEY",
-        "models": ["gpt-4o-mini"],
-        "weight": 1.0
-      }
-    ]
+  "providers": {
+    "openai": {
+      "keys": [
+        {
+          "value": "env.OPENAI_API_KEY",
+          "models": ["gpt-4o-mini"],
+          "weight": 1.0
+        }
+      ]
+    }
   }
 }
 ```
@@ -116,6 +118,7 @@ For additional configurations in HTTP server setup, please read [this](https://g
 {
 	Value:  os.Getenv("OPENAI_API_KEY"),
 	Models: []string{"gpt-4o-mini"},
+	Weight: 1.0,
 },
 }, nil
 }
@@ -205,12 +208,12 @@ With Bifrost, you can focus on building your AI-powered applications without wor
 
 - **Multi-Provider Support**: Integrate with OpenAI, Anthropic, Amazon Bedrock, Mistral, Ollama, and more through a single API
 - **Fallback Mechanisms**: Automatically retry failed requests with alternative models or providers
-- **Dynamic Key Management**: Rotate and manage API keys efficiently
+- **Dynamic Key Management**: Rotate and manage API keys efficiently with weighted distribution
 - **Connection Pooling**: Optimize network resources for better performance
 - **Concurrency Control**: Manage rate limits and parallel requests effectively
 - **Flexible Transports**: Multiple transports for easy integration into your infra
 - **Plugin First Architecture**: No callback hell, simple addition/creation of custom plugins
-- **MCP Integration**: Built-in Model Context Protocol (MCP) support for external tool integration
+- **MCP Integration**: Built-in Model Context Protocol (MCP) support for external tool integration and execution
 - **Custom Configuration**: Offers granular control over pool sizes, network retry settings, fallback providers, and network proxy configurations
 - **Built-in Observability**: Native Prometheus metrics out of the box, no wrappers, no sidecars, just drop it in and scrape
````

core/mcp.go

Lines changed: 2 additions & 0 deletions

```diff
@@ -486,6 +486,8 @@ func (m *MCPManager) connectToMCPClient(config schemas.MCPClientConfig) error {
 		for toolName, tool := range tools {
 			client.ToolMap[toolName] = tool
 		}
+
+		m.logger.Info(fmt.Sprintf("%s Connected to MCP client: %s", MCPLogPrefix, config.Name))
 	} else {
 		return fmt.Errorf("client %s was removed during connection setup", config.Name)
 	}
```

docs/http-transport-api.md

Lines changed: 84 additions & 3 deletions

````diff
@@ -4,9 +4,11 @@ This document provides comprehensive API documentation for the Bifrost HTTP tran
 
 ## Base URL
 
+```text
+http://localhost:8080
 ```
-http://localhost:8080
-```
+
+> 🔧 **MCP (Model Context Protocol) Integration**: Bifrost HTTP transport includes built-in MCP support for external tool integration. When MCP is configured, tools are automatically discovered and added to model requests. For comprehensive MCP setup and usage, see the [**MCP Integration Guide**](mcp.md) and [**HTTP Transport MCP Configuration**](../transports/README.md#mcp-model-context-protocol-configuration).
 
 ## OpenAPI Specification
 
@@ -215,7 +217,86 @@ Creates a text completion from a prompt.
 }
 ```
 
-### 3. Metrics
+### 3. MCP Tool Execution
+
+**POST** `/v1/mcp/tool/execute`
+
+Executes MCP (Model Context Protocol) tools that have been configured in Bifrost. This endpoint is used to execute tool calls returned by AI models during conversations.
+
+> **Note**: This endpoint requires MCP to be configured in Bifrost. See [MCP Integration Guide](mcp.md) for setup instructions.
+
+#### Request Body
+
+```json
+{
+  "type": "function",
+  "id": "toolu_01Vmq4gaU6tSy7ZRKVC7U2fg",
+  "function": {
+    "name": "google_search",
+    "arguments": "{\"gl\":\"us\",\"hl\":\"en\",\"num\":5,\"q\":\"San Francisco news yesterday\",\"tbs\":\"qdr:d\"}"
+  }
+}
+```
+
+#### Response
+
+```json
+{
+  "role": "tool",
+  "content": "{\n  \"searchParameters\": {\n    \"q\": \"San Francisco news yesterday\",\n    \"gl\": \"us\",\n    \"hl\": \"en\",\n    \"type\": \"search\",\n    \"num\": 5,\n    \"tbs\": \"qdr:d\",\n    \"engine\": \"google\"\n  },\n  \"organic\": [\n    {\n      \"title\": \"San Francisco Chronicle · Giants' today\"\n    },\n    {\n      \"query\": \"s.f. chronicle e edition\"\n    }\n  ],\n  \"credits\": 1\n}",
+  "tool_call_id": "toolu_01Vmq4gaU6tSy7ZRKVC7U2fg"
+}
+```
+
+#### Multi-Turn Tool Workflow
+
+The typical workflow for using MCP tools involves:
+
+1. **Send chat completion request** → AI responds with `tool_calls`
+2. **Execute tools via `/v1/mcp/tool/execute`** → Get tool result messages
+3. **Add tool results to conversation** → Send back for final response
+
+```bash
+# Step 1: Chat completion (AI decides to use tools)
+curl -X POST http://localhost:8080/v1/chat/completions \
+  -H "Content-Type: application/json" \
+  -d '{
+    "provider": "openai",
+    "model": "gpt-4o-mini",
+    "messages": [
+      {"role": "user", "content": "Search for San Francisco news from yesterday"}
+    ]
+  }'
+
+# Step 2: Execute the tool call returned by AI
+curl -X POST http://localhost:8080/v1/mcp/tool/execute \
+  -H "Content-Type: application/json" \
+  -d '{
+    "type": "function",
+    "id": "toolu_01Vmq4gaU6tSy7ZRKVC7U2fg",
+    "function": {
+      "name": "google_search",
+      "arguments": "{\"q\":\"San Francisco news yesterday\"}"
+    }
+  }'
+
+# Step 3: Continue conversation with tool results
+curl -X POST http://localhost:8080/v1/chat/completions \
+  -H "Content-Type: application/json" \
+  -d '{
+    "provider": "openai",
+    "model": "gpt-4o-mini",
+    "messages": [
+      {"role": "user", "content": "Search for San Francisco news from yesterday"},
+      {"role": "assistant", "tool_calls": [...]},
+      {"role": "tool", "content": "...", "tool_call_id": "toolu_01Vmq4gaU6tSy7ZRKVC7U2fg"}
+    ]
+  }'
+```
+
+For detailed MCP setup and multi-turn conversation examples, see [Multi-Turn Conversations with MCP Tools](../transports/README.md#multi-turn-conversations-with-mcp-tools).
+
+### 4. Metrics
 
 **GET** `/metrics`
````

docs/logger.md

Lines changed: 6 additions & 6 deletions

````diff
@@ -37,16 +37,16 @@ client, err := bifrost.Init(schemas.BifrostConfig{
 
 The default logger formats messages as:
 
-```
-[BIFROST-TIMESTAMP] LEVEL: message
-[BIFROST-TIMESTAMP] ERROR: (error: error_message)
+```text
+[BIFROST-TIMESTAMP] LEVEL: message
+[BIFROST-TIMESTAMP] ERROR: (error: error_message)
 ```
 
 Example outputs:
 
-```
-[BIFROST-2024-03-20T10:15:30Z] INFO: Initializing provider OpenAI
-[BIFROST-2024-03-20T10:15:31Z] ERROR: (error: failed to connect to provider)
+```text
+[BIFROST-2024-03-20T10:15:30Z] INFO: Initializing provider OpenAI
+[BIFROST-2024-03-20T10:15:31Z] ERROR: (error: failed to connect to provider)
 ```
 
 ## 3. Implementing a Custom Logger
````
