A serverless ComfyUI deployment system that runs workflows on Modal and exposes them via MCP (Model Context Protocol) servers for easy integration with AI assistants.
Includes:
- Serverless ComfyUI: Deploy workflows with automatic scaling and GPU optimization
- Memory Snapshots: Fast cold starts using Modal's memory snapshot feature
- Workspace Management: Organize workflows, models, and custom nodes by workspace
- MCP Integration: Expose workflows as tools for AI assistants
- Parameter Injection: Dynamic workflow customization via YAML configuration
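For instance, parameter injection could map user-supplied arguments onto node inputs in an exported ComfyUI workflow graph. A minimal sketch, assuming a hypothetical config format (the real YAML schema in this repo may differ):

```python
# Hypothetical sketch of parameter injection: the config schema and key
# names below are illustrative assumptions, not the project's actual format.
import copy

# Parsed YAML config (shown here as a dict) mapping tool parameters
# to node inputs in the exported ComfyUI workflow graph.
PARAM_CONFIG = {
    "prompt": {"node_id": "6", "field": "text"},
    "steps": {"node_id": "3", "field": "steps"},
}

def inject_params(workflow: dict, args: dict, config: dict = PARAM_CONFIG) -> dict:
    """Return a copy of the workflow with user args written into node inputs."""
    wf = copy.deepcopy(workflow)
    for name, value in args.items():
        spec = config[name]
        wf[spec["node_id"]]["inputs"][spec["field"]] = value
    return wf

# A tiny stand-in for an exported workflow graph
workflow = {
    "3": {"class_type": "KSampler", "inputs": {"steps": 20}},
    "6": {"class_type": "CLIPTextEncode", "inputs": {"text": ""}},
}
result = inject_params(workflow, {"prompt": "a red fox", "steps": 30})
```

The original workflow dict is left untouched so the same template can be reused across requests.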
- `modal_comfy.py`: Main deployment script creating Modal apps with ComfyUI
- `comfyclient.py`: Client for testing deployed workflows
- `deploy_constants.py`: GPU, timeout, and resource configuration
- `mcp/`: MCP server implementation for AI assistant integration
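As an illustration of the kind of request a client like `comfyclient.py` might issue against a deployed endpoint, here is a hedged sketch that builds (but does not send) the POST request; the endpoint URL shape and payload keys are assumptions, not the project's actual API:

```python
# Illustrative only: URL pattern and payload keys are assumed, not taken
# from the real comfyclient.py.
import json
import urllib.request

def build_request(url: str, workflow_name: str, test_args: dict) -> urllib.request.Request:
    """Build (but do not send) a POST request invoking a deployed workflow."""
    payload = {"workflow": workflow_name, "args": test_args}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request(
    "https://edenartlab--slow-new-stage.modal.run",  # assumed endpoint shape
    "txt2img",
    {"prompt": "a red fox"},
)
```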
```bash
# Set workspace and deploy to Modal
WORKSPACE=slow_new modal deploy modal_comfy.py

# Test a specific workflow
python comfyclient.py \
    --modal-workspace edenartlab \
    --workspace_name slow-new-stage \
    --workflow txt2img \
    --test-json workspaces/slow_new/workflows/txt2img/test.json

# Serve ComfyUI as an interactive endpoint accessible through the browser
WORKSPACE=slow_new modal serve modal_comfy.py
```

The MCP server exposes ComfyUI workflows as tools for AI assistants.
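Conceptually, each workflow is advertised as an MCP tool with a JSON schema and dispatched on `tools/call`. A minimal, dependency-free sketch of that shape (the tool schema and handler names are illustrative assumptions, not the project's actual code):

```python
# Illustrative MCP tool definition; the real server would register this
# through an MCP SDK rather than a bare dict.
TXT2IMG_TOOL = {
    "name": "txt2img",
    "description": "Generate an image from a text prompt via ComfyUI",
    "inputSchema": {
        "type": "object",
        "properties": {
            "prompt": {"type": "string"},
            "steps": {"type": "integer", "default": 20},
        },
        "required": ["prompt"],
    },
}

def handle_tool_call(name: str, arguments: dict) -> dict:
    """Dispatch an MCP tools/call request to the matching workflow (stubbed)."""
    if name != TXT2IMG_TOOL["name"]:
        raise ValueError(f"unknown tool: {name}")
    # The real server would submit the workflow to Modal and wait for
    # outputs; here we just apply defaults and echo the resolved arguments.
    resolved = {"steps": 20, **arguments}
    return {"workflow": name, "args": resolved}

result = handle_tool_call("txt2img", {"prompt": "a red fox"})
```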
```bash
# Run MCP server locally
cd mcp/
python mcp_server.py

# Test with the client
python mcp_client.py --local

# Deploy to Modal
modal deploy mcp/mcp_server.py

# Test the remote deployment
python mcp_client.py --remote https://your-modal-url/mcp
```

Add to your Claude Desktop config:
```json
{
  "mcpServers": {
    "comfyui": {
      "command": "python",
      "args": ["/path/to/mcp_server.py"],
      "env": {
        "MODAL_WORKSPACE": "your-workspace",
        "WORKSPACE_NAME": "your-workspace-name"
      }
    }
  }
}
```

- fix custom_node installations (comfy-cli vs git clone ...)
- correctly parse all workflow outputs and stream them to the client
- implement all custom arg injection types (folder, array, ...)
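On the output-parsing TODO: ComfyUI's `/history` endpoint keys outputs by node id, with image records carrying `filename`/`subfolder` fields. A hedged sketch of flattening those into file paths (the sample data is fabricated for illustration):

```python
# Sketch of collecting image outputs from a ComfyUI /history entry.
# The response shape follows ComfyUI's API; the sample below is made up.
def collect_output_files(history_entry: dict) -> list:
    """Flatten image outputs from a ComfyUI history entry into file paths."""
    files = []
    for node_output in history_entry.get("outputs", {}).values():
        for image in node_output.get("images", []):
            subfolder = image.get("subfolder", "")
            name = image["filename"]
            files.append(f"{subfolder}/{name}" if subfolder else name)
    return files

sample = {
    "outputs": {
        "9": {"images": [
            {"filename": "ComfyUI_00001_.png", "subfolder": "", "type": "output"},
        ]},
    }
}
paths = collect_output_files(sample)
```

Streaming to the client would then walk this list and forward each file as it lands.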