
Commit 7130a77 (initial commit, 0 parents)

Initialize Sequential Thinking MCP Server with core functionality, README, and configuration files

File tree

6 files changed: +333 −0 lines

.gitignore

Lines changed: 3 additions & 0 deletions
.venv
__pycache__
*.pyc

LICENSE

Lines changed: 21 additions & 0 deletions
MIT License

Copyright (c) <2025> <Arben Ademi>

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

README.md

Lines changed: 98 additions & 0 deletions
# Sequential Thinking MCP Server 🧠

A powerful Model Context Protocol (MCP) server that helps break down complex problems into clear, sequential steps. This tool enhances structured problem-solving by managing thought sequences, allowing revisions, and supporting multiple solution paths.

## 🌟 Key Features

- **Sequential Problem Solving**: Break down complex problems step-by-step
- **Full MCP Integration**: Seamless integration with Claude Desktop

## 🚀 Getting Started

### System Requirements

- Python 3.10+
- UV package manager (preferred) or pip
- Claude Desktop application

### Step-by-Step Installation

1. **Set Up Environment**

   ```bash
   # Create and activate virtual environment
   uv venv
   .venv\Scripts\activate     # Windows
   source .venv/bin/activate  # Unix/Mac
   ```

2. **Install Package**

   With the virtual environment from step 1 active:

   ```bash
   uv pip install -e .
   ```

3. **Launch Server**

   ```bash
   mcp-sequential-thinking
   ```

4. **Configure Claude Desktop**

   Add to `claude_desktop_config.json`:

   ```json
   {
     "mcpServers": {
       "sequential-thinking": {
         "command": "C:\\path\\to\\your\\.venv\\Scripts\\mcp-sequential-thinking.exe"
       }
     }
   }
   ```

## 🛠️ Core Parameters

| Parameter | Description | Required |
|-----------|-------------|----------|
| `thought` | Current thinking step | Yes |
| `thought_number` | Step sequence number | Yes |
| `total_thoughts` | Estimated steps needed | Yes |
| `next_thought_needed` | Indicates whether more steps are required | Yes |
| `is_revision` | Marks this thought as a revision | No |
| `revises_thought` | Number of the thought being revised | No |
| `branch_from_thought` | Thought number to branch from | No |
| `branch_id` | Unique branch identifier | No |

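For example, a revision step might be submitted with arguments like the following. This is a minimal sketch with illustrative values; the parameter names mirror the `sequential_thinking` tool signature in `server.py`:

```python
# Hypothetical arguments for a revision step: thought 4 re-examines thought 2.
revision_call = dict(
    thought="Re-checking the complexity estimate from step 2; it is O(n log n), not O(n).",
    thought_number=4,
    total_thoughts=5,
    next_thought_needed=True,
    is_revision=True,
    revises_thought=2,
)
```
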
## 🔄 Response Format

The server returns JSON with:

- `thoughtNumber`: Current step number
- `totalThoughts`: Total steps estimated
- `nextThoughtNeeded`: Whether more steps are needed
- `branches`: List of active branch IDs
- `thoughtHistoryLength`: Total thoughts recorded

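For concreteness, here is a minimal sketch (not part of this commit) that drives the `SequentialThinkingServer` class from `mcp_sequential_thinking/server.py` directly and parses the payload described above. The thought text and counts are illustrative:

```python
import json
from mcp_sequential_thinking.server import SequentialThinkingServer

# Feed one thought to the server and inspect the JSON it hands back to the client.
server = SequentialThinkingServer()
result = server.process_thought({
    "thought": "Break the problem into input parsing, core logic, and output.",
    "thoughtNumber": 1,
    "totalThoughts": 3,
    "nextThoughtNeeded": True,
})

payload = json.loads(result["content"][0]["text"])
print(payload["thoughtNumber"])         # 1
print(payload["nextThoughtNeeded"])     # True
print(payload["thoughtHistoryLength"])  # 1
```
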
## 👩‍💻 Development Setup

1. Clone repository
2. Create development environment:

   ```bash
   uv venv
   source .venv/bin/activate  # or `.venv\Scripts\activate` on Windows
   uv pip install -e .
   ```

## 📝 Example Usage

To use with Claude:

1. Start the server
2. In Claude, begin with: "Use sequential thinking to solve this problem..."
3. Claude will call the `sequential_thinking` tool to work through the problem step by step

## 🤝 Contributing & Support

- Submit issues for bugs or suggestions
- Pull requests are welcome
- Follow the coding standards in CONTRIBUTING.md

## 📄 License

This project is licensed under the [MIT License](LICENSE).

mcp_sequential_thinking/__init__.py

Whitespace-only changes.

mcp_sequential_thinking/server.py

Lines changed: 183 additions & 0 deletions
from dataclasses import dataclass
from typing import Dict, List, Optional, Any
import json
from mcp.server.fastmcp import FastMCP
from rich.console import Console
from rich.panel import Panel
from rich.text import Text

console = Console(stderr=True)

@dataclass
class ThoughtData:
    thought: str
    thought_number: int
    total_thoughts: int
    next_thought_needed: bool
    is_revision: Optional[bool] = None
    revises_thought: Optional[int] = None
    branch_from_thought: Optional[int] = None
    branch_id: Optional[str] = None
    needs_more_thoughts: Optional[bool] = None

class SequentialThinkingServer:
    def __init__(self):
        self.thought_history: List[ThoughtData] = []
        self.branches: Dict[str, List[ThoughtData]] = {}

    def _validate_thought_data(self, input_data: dict) -> ThoughtData:
        """Validate and convert input dictionary to ThoughtData."""
        required_fields = {
            "thought": str,
            "thoughtNumber": int,
            "totalThoughts": int,
            "nextThoughtNeeded": bool
        }

        for field, field_type in required_fields.items():
            if field not in input_data:
                raise ValueError(f"Missing required field: {field}")
            if not isinstance(input_data[field], field_type):
                raise ValueError(f"Invalid type for {field}: expected {field_type}")

        return ThoughtData(
            thought=input_data["thought"],
            thought_number=input_data["thoughtNumber"],
            total_thoughts=input_data["totalThoughts"],
            next_thought_needed=input_data["nextThoughtNeeded"],
            is_revision=input_data.get("isRevision"),
            revises_thought=input_data.get("revisesThought"),
            branch_from_thought=input_data.get("branchFromThought"),
            branch_id=input_data.get("branchId"),
            needs_more_thoughts=input_data.get("needsMoreThoughts")
        )

    def _format_thought(self, thought_data: ThoughtData) -> Panel:
        """Format a thought into a rich Panel with appropriate styling."""
        if thought_data.is_revision:
            prefix = "🔄 Revision"
            context = f" (revising thought {thought_data.revises_thought})"
            style = "yellow"
        elif thought_data.branch_from_thought:
            prefix = "🌿 Branch"
            context = f" (from thought {thought_data.branch_from_thought}, ID: {thought_data.branch_id})"
            style = "green"
        else:
            prefix = "💭 Thought"
            context = ""
            style = "blue"

        header = Text(f"{prefix} {thought_data.thought_number}/{thought_data.total_thoughts}{context}", style=style)
        content = Text(thought_data.thought)

        return Panel.fit(
            content,
            title=header,
            border_style=style,
            padding=(1, 2)
        )

    def process_thought(self, input_data: Any) -> dict:
        """Process a thought and return formatted response."""
        try:
            thought_data = self._validate_thought_data(input_data)

            # Adjust total thoughts if needed
            if thought_data.thought_number > thought_data.total_thoughts:
                thought_data.total_thoughts = thought_data.thought_number

            # Store thought in history
            self.thought_history.append(thought_data)

            # Handle branching
            if thought_data.branch_from_thought and thought_data.branch_id:
                if thought_data.branch_id not in self.branches:
                    self.branches[thought_data.branch_id] = []
                self.branches[thought_data.branch_id].append(thought_data)

            # Display formatted thought
            console.print(self._format_thought(thought_data))

            return {
                "content": [{
                    "type": "text",
                    "text": json.dumps({
                        "thoughtNumber": thought_data.thought_number,
                        "totalThoughts": thought_data.total_thoughts,
                        "nextThoughtNeeded": thought_data.next_thought_needed,
                        "branches": list(self.branches.keys()),
                        "thoughtHistoryLength": len(self.thought_history)
                    }, indent=2)
                }]
            }

        except Exception as e:
            return {
                "content": [{
                    "type": "text",
                    "text": json.dumps({
                        "error": str(e),
                        "status": "failed"
                    }, indent=2)
                }],
                "isError": True
            }

def create_server() -> FastMCP:
    """Create and configure the MCP server."""
    mcp = FastMCP("sequential-thinking")
    thinking_server = SequentialThinkingServer()

    @mcp.tool()
    async def sequential_thinking(
        thought: str,
        thought_number: int,
        total_thoughts: int,
        next_thought_needed: bool,
        is_revision: Optional[bool] = None,
        revises_thought: Optional[int] = None,
        branch_from_thought: Optional[int] = None,
        branch_id: Optional[str] = None,
        needs_more_thoughts: Optional[bool] = None
    ) -> str:
        """A detailed tool for dynamic and reflective problem-solving through thoughts.

        This tool helps analyze problems through a flexible thinking process that can adapt and evolve.
        Each thought can build on, question, or revise previous insights as understanding deepens.

        Args:
            thought: Your current thinking step
            thought_number: Current thought number in sequence
            total_thoughts: Current estimate of thoughts needed
            next_thought_needed: Whether another thought step is needed
            is_revision: Whether this revises previous thinking
            revises_thought: Which thought is being reconsidered
            branch_from_thought: Branching point thought number
            branch_id: Branch identifier
            needs_more_thoughts: If more thoughts are needed
        """
        input_data = {
            "thought": thought,
            "thoughtNumber": thought_number,
            "totalThoughts": total_thoughts,
            "nextThoughtNeeded": next_thought_needed,
            "isRevision": is_revision,
            "revisesThought": revises_thought,
            "branchFromThought": branch_from_thought,
            "branchId": branch_id,
            "needsMoreThoughts": needs_more_thoughts
        }

        result = thinking_server.process_thought(input_data)
        return result["content"][0]["text"]

    return mcp

def main():
    """Main entry point for the sequential thinking server."""
    server = create_server()
    return server.run()

if __name__ == "__main__":
    server = create_server()
    server.run()

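To illustrate the branch bookkeeping in `process_thought`, here is a hedged, standalone sketch (not part of the commit) that feeds the server a plain thought followed by a branching thought and checks the state it accumulates; the thought text, numbers, and branch ID are illustrative only:

```python
import json
from mcp_sequential_thinking.server import SequentialThinkingServer

server = SequentialThinkingServer()

# Thought 1: an ordinary sequential step.
server.process_thought({
    "thought": "List the candidate approaches.",
    "thoughtNumber": 1,
    "totalThoughts": 2,
    "nextThoughtNeeded": True,
})

# Thought 2: branch off thought 1 to explore an alternative path.
result = server.process_thought({
    "thought": "Explore the recursive formulation instead.",
    "thoughtNumber": 2,
    "totalThoughts": 2,
    "nextThoughtNeeded": False,
    "branchFromThought": 1,
    "branchId": "alt-recursive",
})

payload = json.loads(result["content"][0]["text"])
assert payload["branches"] == ["alt-recursive"]
assert payload["thoughtHistoryLength"] == 2
```
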

pyproject.toml

Lines changed: 28 additions & 0 deletions
[project]
name = "sequential-thinking"
version = "0.2.0"
description = "A Sequential Thinking MCP Server for advanced problem solving"
readme = "README.md"
requires-python = ">=3.10"
license = { text = "MIT" }
keywords = ["mcp", "ai", "problem-solving"]
authors = [
    { name = "Arben Ademi", email = "arben.ademi@tuta.io" }
]
dependencies = [
    "mcp[cli]>=1.2.0",
    "rich>=13.7.0",
]

[project.scripts]
mcp-sequential-thinking = "mcp_sequential_thinking.server:main"

[project.urls]
Source = "https://github.com/arben-adm/sequential-thinking"

[tool.hatch.build.targets.wheel]
packages = ["mcp_sequential_thinking"]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

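The `[project.scripts]` entry is what makes the `mcp-sequential-thinking` command from the README available after `uv pip install -e .`: the console script resolves to `mcp_sequential_thinking.server:main`. A minimal sketch of the equivalent manual invocation, as a hypothetical helper file:

```python
# run_server.py — hypothetical helper, equivalent to the console script
# generated from [project.scripts]; it simply imports and calls server.main().
from mcp_sequential_thinking.server import main

if __name__ == "__main__":
    main()
```
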
