This project is largely a fork of the LLaMa MCP Streamlit project, with a few differences:
- The MCP server used in this demo has been changed from Playwright MCP to Google Maps MCP.
- The .env file has been altered to use a locally hosted LLM that takes advantage of the OpenAI API.
- The mcp_client script has been slightly changed to account for the way Google Maps MCP lists its tools.
- The system prompt has been drastically shortened.
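The tool-listing tweak can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the helper name, the example tool, and the exact schema field names are assumptions based on how MCP servers typically describe tools and on the OpenAI function-calling format.

```python
def mcp_tool_to_openai(tool: dict) -> dict:
    """Convert an MCP-style tool description (name, description, inputSchema)
    into the OpenAI function-calling tool format."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # MCP servers expose a JSON Schema under "inputSchema";
            # the OpenAI API expects it under "parameters".
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }

# Hypothetical tool shaped like a Google Maps MCP listing (illustrative only)
maps_tool = {
    "name": "maps_geocode",
    "description": "Geocode an address",
    "inputSchema": {
        "type": "object",
        "properties": {"address": {"type": "string"}},
        "required": ["address"],
    },
}
openai_tool = mcp_tool_to_openai(maps_tool)
```

The converted list can then be passed as the `tools` argument in a chat-completion request to the locally hosted, OpenAI-compatible server.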
There are still a few things that I want to work on:
- Containerize the application
- Determine how multiple tools can be used
- Mix Stdio and SSE servers
To run the demo yourself:
- Clone this repo
- Obtain a Google Maps API Key from this site.
- Rename the .env.example file to .env
- Enter your inference server's API URL, API key, and Google Maps API key into the appropriate fields in the .env file
- Execute the run.sh script, or run:

  ```
  poetry run streamlit run llm-mcp-streamlit/main.py
  ```
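For reference, a filled-in .env might look something like the fragment below. The variable names here are assumptions based on the description above; check .env.example for the actual names your copy of the project uses.

```
# Hypothetical variable names -- confirm against .env.example
OPENAI_API_URL=http://localhost:8000/v1
OPENAI_API_KEY=sk-local-placeholder
GOOGLE_MAPS_API_KEY=your-google-maps-api-key
```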