
feat: add sub-agent MCP server with OpenAI chat completion support #19

Merged
mudler merged 3 commits into mudler:master from localai-bot:task_1772435506 on Mar 2, 2026

Conversation

localai-bot (Contributor) commented on Mar 2, 2026

This PR adds a new MCP server called sub-agent that allows sending chat completion messages to any OpenAI-compatible endpoint.

Features

  • Send chat completion requests via sub_agent_send tool
  • Background job tracking with goroutines
  • In-memory storage with configurable TTL
  • List active jobs with sub_agent_list tool
  • Retrieve results with sub_agent_get_result tool

Configuration

Environment variables:

  • OPENAI_BASE_URL: The base URL for the OpenAI API endpoint
  • OPENAI_MODEL: The model to use (default: gpt-3.5-turbo)
  • OPENAI_API_KEY: API key for authentication
  • TTL: Time-to-live for stored results (default: 1h)

Acceptance Criteria

  • New directory sub-agent/ exists with complete implementation
  • Server can be configured via environment variables
  • OpenAI chat completions work using github.com/sashabaranov/go-openai
  • Background job tracking with goroutines and in-memory storage
  • TTL mechanism properly expires old results
  • MCP tools are properly registered and functional
  • Documentation (README.md) explains usage and configuration
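The PR itself uses `github.com/sashabaranov/go-openai` for the completion calls; the stdlib-only sketch below just shows the wire format such a request takes against an OpenAI-compatible `chat/completions` endpoint. The `newChatHTTPRequest` helper and its parameters are hypothetical, and the base URL in `main` is a placeholder.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// chatMessage and chatRequest follow the OpenAI chat completion
// request schema (model + ordered list of role/content messages).
type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string        `json:"model"`
	Messages []chatMessage `json:"messages"`
}

// newChatHTTPRequest builds (but does not send) a chat completion
// request against an OpenAI-compatible base URL.
func newChatHTTPRequest(baseURL, apiKey, model, prompt string) (*http.Request, error) {
	body, err := json.Marshal(chatRequest{
		Model:    model,
		Messages: []chatMessage{{Role: "user", Content: prompt}},
	})
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest("POST", baseURL+"/chat/completions", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer "+apiKey)
	return req, nil
}

func main() {
	req, _ := newChatHTTPRequest("http://localhost:8080/v1", "sk-test", "gpt-3.5-turbo", "hello")
	fmt.Println(req.Method, req.URL.Path) // prints "POST /v1/chat/completions"
}
```

Pointing `OPENAI_BASE_URL` at any server that speaks this schema (LocalAI, for example) is what makes the server work with "any OpenAI-compatible endpoint".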

Supersedes #18

- Add new sub-agent directory with main.go and README.md
- Implement sub_agent_send tool for chat completions
- Implement sub_agent_list tool to list active jobs
- Implement sub_agent_get_result tool to retrieve results
- Support background job tracking with goroutines
- In-memory storage with configurable TTL
- Add go-openai dependency for OpenAI API calls
@@ -0,0 +1,133 @@
# sub-agent MCP Server

A Model Context Protocol (MCP) server that allows sending chat completion messages to any OpenAI-compatible endpoint, with support for background job tracking using goroutines and in-memory storage with TTL.
mudler (Owner) commented:

update the top-level README.md

mudler (Owner) commented:

you have committed the binary as well with the code. Remove that

mudler (Owner) left a review:

see my comments

localai-bot (Contributor, Author) commented:

Removed the binary file from the repository as requested. The binary should not be committed to the repo.

mudler (Owner) left a review:

update the top-level README with this MCP server

localai-bot (Contributor, Author) commented:

Pushed updates based on feedback. Added sub-agent documentation to the top-level README.md as requested. Returning to IN_REVIEW.

@mudler mudler merged commit d56e966 into mudler:master Mar 2, 2026
16 checks passed