feat: Make maximum message size configurable #1019

Open
wants to merge 2 commits into main from configure-input-size

Conversation

@pantanurag555 commented Jun 24, 2025

Motivation and Context

Currently, the maximum message/payload size for HTTP requests is hard-coded to 4 MB in the MCP protocol. This setting should be configurable by server owners based on the requirements of their specific server, as noted by the community in #959 and #1012. Making it configurable would help the MCP protocol facilitate transmission of base64-encoded images/videos between client and server for multi-modal workloads, which can easily exceed 4 MB.

The change in this PR makes the maximum message size configurable on FastMCP servers as a server setting (see the usage sketch below).
Resolves #959
Resolves #1012
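
As a rough sketch of the intended usage, assuming the setting is exposed as a FastMCP constructor keyword named `maximum_message_size` (the name used in the testing notes below; it may differ in the final API), a server owner could raise the limit like this:

```python
# Hypothetical usage sketch; the keyword name `maximum_message_size` and the
# server name are illustrative, not confirmed API.
from mcp.server.fastmcp import FastMCP

# Raise the limit from the hard-coded 4 MB default to 16 MB.
mcp = FastMCP("image-server", maximum_message_size=16 * 1024 * 1024)


@mcp.tool()
def describe_image(image_base64: str) -> str:
    """Accept a base64-encoded image payload that may exceed 4 MB."""
    return f"received {len(image_base64)} base64 characters"


if __name__ == "__main__":
    mcp.run(transport="streamable-http")
```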

How Has This Been Tested?

  • Added relevant unit tests
  • Created a FastMCP server with maximum_message_size set to 1 after building the changes locally, and confirmed that POST requests are rejected with 413 Content Too Large (see the sketch after this list)
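
For reference, the check that produces the 413 response has roughly this shape: the HTTP transport compares the incoming body size against the configured limit before handing the message to the JSON-RPC layer. This is an illustrative sketch built on Starlette primitives, not necessarily the exact code in this PR; the handler name and module-level constant are hypothetical.

```python
# Illustrative sketch of the request-size guard, assuming a Starlette-based
# HTTP transport; the actual enforcement in the PR may be structured differently.
from starlette.requests import Request
from starlette.responses import Response

MAXIMUM_MESSAGE_SIZE = 4 * 1024 * 1024  # default 4 MB, now a server setting


async def handle_post_message(request: Request) -> Response:
    body = await request.body()
    if len(body) > MAXIMUM_MESSAGE_SIZE:
        # Reject oversized payloads before any JSON-RPC parsing happens.
        return Response("Content Too Large", status_code=413)
    # ... otherwise hand the message off to the MCP session as usual ...
    return Response(status_code=202)
```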

Breaking Changes

None

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation update

Checklist

  • I have read the MCP Documentation
  • My code follows the repository's style guidelines
  • New and existing tests pass locally
  • I have added appropriate error handling
  • I have added or updated documentation as needed

@pantanurag555 force-pushed the configure-input-size branch from f498499 to 1afb342 on June 27, 2025 at 17:53