
ZIN-MCP-CLIENT (Part of Zin's MCP Suite)

⚑ A lightweight, fast, simple, CLI-based MCP Client for STDIO MCP Servers, built to fill the gap and provide a bridge between your local LLMs running on Ollama and MCP Servers.


Banner image generated using AI tools.


πŸ€– What is ZIN MCP Client?

A powerful yet lightweight, fast, simple and flexible client for interacting with MCP (Model Context Protocol) servers through local LLMs. This tool allows you to connect to and utilize various tools from multiple MCP servers through an easy-to-use command-line interface.

Watch the demos!

  • Rich CLI interaction
zin-mcp-client-1.mp4
  • Perform Code Review to Find Vulnerabilities locally
zin-mcp-client-2.mp4

Features

  • Multi-Server Support: Connect to multiple MCP servers simultaneously
  • Local LLM Integration: Use local LLMs via Ollama for privacy and control
  • Interactive CLI: Clean, intuitive command-line interface with rich formatting
  • Comprehensive Logging: Detailed logs for debugging and troubleshooting
  • ReAct Agent Framework: Utilizes LangChain's ReAct agent pattern to intelligently invoke tools
  • Cross Platform: Runs on Linux, macOS, and Windows
  • Simple, Fast, Lightweight: It is simple, it is fast, it is lightweight

Check the Zin MCP Suite

Note

This project is mainly developed for the Zin MCP Servers mentioned below. Other STDIO-based MCP Servers should work in theory, but testing has been done only on the following servers.

πŸ› οΈ Getting Started

# 1. Unzip the downloaded release
unzip zin-mcp-client-<version>.zip

zin-mcp-client/
β”œβ”€β”€ zin_mcp_client.py
β”œβ”€β”€ src/
β”œβ”€β”€ mcp-config.json
β”œβ”€β”€ README.md
└── LICENSE

# 2. Navigate to the zin-mcp-client directory
cd zin-mcp-client

# 3. This project uses uv (recommended) - https://github.com/astral-sh/uv - instead of pip for dependency management.
    ## a. Install uv (if you don't have it yet) - (the only required step)
curl -LsSf https://astral.sh/uv/install.sh | sh

# The steps below are optional.
    ## b. OPTIONAL: if for any reason you get dependency errors, set up a virtual environment
uv venv
source .venv/bin/activate  # or .venv\Scripts\activate on Windows
    ## c. OPTIONAL: install dependencies
uv pip install -r requirements.txt

# 4. Not recommended: you can also use pip instead of uv.
pip install -r requirements.txt
# or
pip install -r requirements.txt --break-system-packages

# The setup for zin-mcp-client is done.
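As a quick sanity check that the setup worked, you can try printing the client's help output. This is a minimal sketch; it assumes the script exposes the usual argparse-style --help flag, which is an assumption rather than something documented here:

# Quick sanity check (assumes standard --help support)
uv run zin_mcp_client.py --help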

πŸ€– 2. Ollama Setup

1. Download and Install ollama: https://ollama.com/download

If you are on Linux, you can directly run the command below to install it:

> curl -fsSL https://ollama.com/install.sh | sh

2. Download and run any LLM that has the capability to invoke tools.

For example, llama3.1 has the capability to invoke tools.

You can run it using the following command:

> ollama run llama3.1:8b

[Note]: Kindly note that the above command fetches the 8B-parameter model. If you have stronger hardware, fetch a higher-parameter model for better performance.
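For instance, assuming your hardware can handle it, you could pull one of the larger llama3.1 variants published on ollama.com instead:

# Larger variant for better tool-calling quality (requires significantly more RAM/VRAM)
ollama pull llama3.1:70b
ollama run llama3.1:70b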

3. Serve the Ollama API using the following command:

> ollama serve

This will serve the Ollama API on port 11434. You can confirm it is running using `curl` as follows:

> curl http://localhost:11434/
`Ollama is running`
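Optionally, you can also verify that your chosen model actually supports tool calling by sending a request with a tool definition to Ollama's /api/chat endpoint. This is a minimal sketch; the get_weather tool below is a made-up example for illustration. A tool-capable model should answer with a tool_calls entry instead of plain text:

curl http://localhost:11434/api/chat -d '{
  "model": "llama3.1:8b",
  "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
  "tools": [{
    "type": "function",
    "function": {
      "name": "get_weather",
      "description": "Get the current weather for a city",
      "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"]
      }
    }
  }],
  "stream": false
}'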

βš™οΈ 3. Config File Setup

The config file is the MCP Server configuration file that tells the Zin MCP Client how to start the MCP Servers.

It follows the same style as Claude's configuration file.

Below is the sample configuration file for Zin MCP Suite Servers:

{
    "mcpServers": {
        "jadx-mcp-server": {
            "command": "/path/to/uv", 
            "args": [
                "--directory",
                "/path/to/jadx-mcp-server/",
                "run",
                "jadx_mcp_server.py"
            ]
        },
        "apktool-mcp-server": {
		    "command": "/path/to/uv",
            "args": [
                "--directory",
                "/path/to/apktool-mcp-server/",
                "run",
                "apktool_mcp_server.py"
            ]
	}
    }
}

Replace:

  • /path/to/uv with the actual path to your uv executable (see the tip after this list)
  • /path/to/jadx-mcp-server/ and /path/to/apktool-mcp-server/ with the absolute paths to where you have stored those servers
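If you are unsure where your uv executable lives, your shell can tell you (the path in the comment is just an example):

which uv    # e.g. /home/zinjacoder/.local/bin/uv
# On Windows: where uv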

Note

The default config file is named mcp-config.json and is located inside the zin-mcp-client directory; however, you can provide the path to your own config file using the --config option, for example:

uv run zin_mcp_client.py --server jadx-mcp-server --model llama3.1:8b --config /home/zinjacoder/mcp-config.json 
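Since a malformed config file is a common source of startup errors, it can also help to validate the JSON before launching. python -m json.tool is part of the Python standard library and will print the parsed config or report the exact syntax error:

python -m json.tool mcp-config.json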

Give it a shot

  1. Run zin_mcp_client.py:
uv run zin_mcp_client.py
  2. Use the --server option to specify the server of your choice, the --config option to provide the path to your config file, the --model option to use a specific model, and the --debug flag to enable verbose output, as shown below.
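For example, a full invocation combining these options (with placeholder paths and model) might look like:

uv run zin_mcp_client.py --server jadx-mcp-server --model llama3.1:8b --config /home/zinjacoder/mcp-config.json --debug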

If something goes wrong - Debugging and Troubleshooting

Note

For low-spec systems, use only one server at a time to avoid LLM hallucinations.

  1. Check the logs:
  • All logs, debug information, raw traffic, and interactions are stored in the logs in an easy-to-read way. If anything goes wrong, check the logs.
  2. Debug Mode:
  • You can also use the --debug flag to print every detail to the console at runtime, to help you find the issue.
zin-mcp-client-debug.mp4
  3. Open an issue:
  • If the logs and debug output don't reveal the problem, open an issue with the respective template.

Current State of Local LLMs and MCPs:

Currently, proprietary API-based models like Anthropic's Claude tend to be more proficient at tool calling.

However, the open-source world is advancing rapidly! Models specifically fine-tuned on function calling datasets are becoming increasingly available through Ollama. Researching models tagged with function calling or tool use on platforms like Hugging Face or checking discussions on communities like r/LocalLLaMA is key to finding capable local options.


πŸ›£οΈ Future Roadmap

  • Add Support HTTP based MCP Servers

  • Add support exposing this client on network as well

  • Store chat locally and provide chat history

  • END-GOAL : Make all reverse engineering MCP Servers, bring them together, to make reverse engineering as easy as possible purely from vibes.

To report bugs, issues, feature suggestions, performance issues, general questions, or documentation issues:

  • Kindly open an issue with the respective template.

  • Tested on macOS and Linux with jadx-mcp-server

πŸ™ Credits

This project is a possible due to ollama, an amazing utility to rung local LLMs. The langchain project,

And in last thanks to @anthropics for developing the Model Context Protocol and @FastMCP team.

Apart from this, huge thanks to all open source projects which serve as a dependencies for this project and which made this possible.

πŸ“„ License

ZIN MCP Client and all related projects inherits the Apache 2.0 License.

βš–οΈ Legal Warning

Disclaimer

The tools zin-mcp-client and zin mcp suite are intended strictly for educational, research, and ethical security assessment purposes. They are provided "as-is" without any warranties, expressed or implied. Users are solely responsible for ensuring that their use of these tools complies with all applicable laws, regulations, and ethical guidelines.

By using zin-mcp-client or zin mcp suite, you agree to use them only in environments you are authorized to test, such as applications you own or have explicit permission to analyze. Any misuse of these tools for unauthorized reverse engineering, infringement of intellectual property rights, or malicious activity is strictly prohibited.

The developers of zin-mcp-client and zin mcp suite shall not be held liable for any damage, data loss, legal consequences, or other consequences resulting from the use or misuse of these tools. Users assume full responsibility for their actions and any impact caused by their usage.

Use responsibly. Respect intellectual property. Follow ethical hacking practices.


πŸ™Œ Contribute or Support

  • Found it useful? Give it a ⭐️
  • Got ideas? Open an issue or submit a PR
  • Built something on top? DM me or mention me, and I'll add it to the README!
  • Like my work and want to keep it going? Sponsor this project.

Built with ❀️ for the reverse engineering and AI communities and all the awesome hackers and open source contributors around the world.
