Filesystem MCP server optimized for AI assistants — Reduce context window tokens by 62.7% while keeping full functionality. Compatible with Claude, ChatGPT, Gemini, Cursor, and all MCP clients.
A token-optimized version of the Filesystem Model Context Protocol (MCP) server.
MCP tool schemas consume significant context window tokens. When AI assistants like Claude or ChatGPT load MCP tools, each tool definition takes up valuable context space.
The original @modelcontextprotocol/server-filesystem loads 14 tools consuming approximately 10,563 tokens — that's space you could use for actual conversation.
filesystem-slim intelligently groups 14 tools into 6 semantic operations, reducing token usage by 62.7% — with zero functionality loss.
Your AI assistant sees fewer, smarter tools. Every original capability remains available.
| Metric | Original | Slim | Reduction |
|---|---|---|---|
| Tools | 14 | 6 | 57.1% |
| Schema Tokens | 2,583 | 518 | 79.9% |
| Claude Code (est.) | ~10,563 | ~3,938 | ~62.7% |
Benchmark Info
- Original: `@modelcontextprotocol/server-filesystem@2025.12.18`
- Schema tokens measured with tiktoken (cl100k_base)
- Claude Code estimate includes ~570 tokens/tool overhead
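
The schema-token figure can be reproduced with a short script. Below is a minimal sketch using the `tiktoken` npm package; the `toolDefinitions` value is a placeholder standing in for the JSON tool list a client actually receives from the server's `tools/list` response:

```typescript
import { get_encoding } from "tiktoken";

// Placeholder for the tool definitions returned by tools/list.
const toolDefinitions = [
  { name: "read", description: "Read file contents...", inputSchema: { /* ... */ } },
];

// Count tokens over the serialized schema with the cl100k_base encoding.
const enc = get_encoding("cl100k_base");
const tokens = enc.encode(JSON.stringify(toolDefinitions));
console.log(`schema tokens: ${tokens.length}`);
enc.free(); // release the WASM encoder
```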
```bash
# Claude Desktop - auto-configure
npx filesystem-slim --setup claude

# Cursor - auto-configure
npx filesystem-slim --setup cursor

# Interactive mode (choose your client)
npx filesystem-slim --setup
```

Done! Restart your app to use filesystem.
```bash
# Claude Code (creates .mcp.json in project root)
claude mcp add filesystem -s project -- npx -y filesystem-slim@latest

# Windows: use cmd /c wrapper
claude mcp add filesystem -s project -- cmd /c npx -y filesystem-slim@latest

# VS Code (Copilot, Cline, Roo Code)
code --add-mcp '{"name":"filesystem","command":"npx","args":["-y","filesystem-slim@latest"]}'
```

Manual configuration options
Add to your claude_desktop_config.json:
| OS | Path |
|---|---|
| Windows | %APPDATA%\Claude\claude_desktop_config.json |
| macOS | ~/Library/Application Support/Claude/claude_desktop_config.json |
```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "filesystem-slim@latest"]
    }
  }
}
```

Add to .cursor/mcp.json (global) or <project>/.cursor/mcp.json (project):
```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "filesystem-slim@latest"]
    }
  }
}
```

MCPSlim acts as a transparent bridge between AI models and the original MCP server:
```
┌─────────────────────────────────────────────────────────────────┐
│                        Without MCPSlim                          │
│                                                                 │
│  [AI Model] ──── reads 14 tool schemas ────→ [Original MCP]     │
│             (~10,563 tokens loaded into context)                │
├─────────────────────────────────────────────────────────────────┤
│                         With MCPSlim                            │
│                                                                 │
│  [AI Model] ───→ [MCPSlim Bridge] ───→ [Original MCP]           │
│       │                 │                    │                  │
│  Sees 6 grouped    Translates to       Executes actual          │
│  tools only        original call       tool & returns           │
│  (~3,938 tokens)                                                │
└─────────────────────────────────────────────────────────────────┘
```
- AI reads slim schema — Only 6 grouped tools instead of 14
- AI calls grouped tool — e.g., `read({ action: "read_text_file", ... })`
- MCPSlim translates — Converts to the original call: `read_text_file({ ... })`
- Original MCP executes — Real server processes the request
- Response returned — Result passes back unchanged
Zero functionality loss. 62.7% token savings.
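
The translation step amounts to a lookup plus an argument rewrite. Here is a minimal sketch of the idea — not the actual MCPSlim source; the mapping entries and helper names are illustrative, and the real bridge covers all 14 original tools:

```typescript
type ToolCall = { name: string; arguments: Record<string, unknown> };

// Maps "group.action" to the original server-filesystem tool name
// (example entries only).
const actionMap: Record<string, string> = {
  "read.read_text_file": "read_text_file",
  "read.read_media_file": "read_media_file",
  "create.write_file": "write_file",
  "create.create_directory": "create_directory",
};

function translate(call: ToolCall): ToolCall {
  const { action, ...rest } =
    call.arguments as { action: string } & Record<string, unknown>;
  const original = actionMap[`${call.name}.${action}`];
  if (!original) {
    throw new Error(`Unknown action "${action}" for grouped tool "${call.name}"`);
  }
  // Forward every argument except the routing "action" field, unchanged.
  return { name: original, arguments: rest };
}

// The grouped call the AI model makes...
const slim: ToolCall = {
  name: "read",
  arguments: { action: "read_text_file", path: "/docs/a.txt" },
};
// ...becomes { name: "read_text_file", arguments: { path: "/docs/a.txt" } }
console.log(translate(slim));
```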
| Group | Actions |
|---|---|
| `create` | 2 |
| `delete` | 2 |
| `file` | 3 |
| `move` | 2 |
| `query` | 2 |
| `read` | 2 |
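
Every underlying operation stays reachable through its group's `action` parameter. Below is a minimal client-side sketch using the official MCP TypeScript SDK; the `read_text_file` action name is an assumption based on the original server's tool set, so consult the slim schema for the exact action names:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch filesystem-slim over stdio, the same way the configs above do.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "filesystem-slim@latest"],
});
const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Grouped call: the "action" field selects the underlying operation.
const result = await client.callTool({
  name: "read",
  arguments: { action: "read_text_file", path: "./README.md" },
});
console.log(result.content);

await client.close();
```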
- ✅ Full functionality — All original `@modelcontextprotocol/server-filesystem` features preserved
- ✅ All AI assistants — Works with Claude, ChatGPT, Gemini, Copilot, and any MCP client
- ✅ Drop-in replacement — Same capabilities, just use grouped action names
- ✅ Tested — Schema compatibility verified via automated tests
No. Every original tool is accessible. Tools are grouped semantically (e.g., `read_text_file` and `read_media_file` → `read`), but all actions remain available via the `action` parameter.
AI models have limited context windows. MCP tool schemas consume tokens that could be used for conversation, code, or documents. Reducing tool schema size means more room for actual work.
MCPSlim is a community project. It wraps official MCP servers transparently — the original server does all the real work.
MIT
Powered by MCPSlim — MCP Token Optimizer
Reduce AI context usage. Keep full functionality.