
llm-mcp-streamlit

Overview

This project is mostly a fork of the LLaMa MCP Streamlit project. A few things are different, though:

  1. The MCP server used in this demo has been changed from Playwright MCP to Google Maps MCP.
  2. The .env file has been altered to use a locally hosted LLM that exposes the OpenAI API.
  3. The mcp_client script has been slightly changed to account for the way Google Maps MCP lists its tools.
  4. The system prompt has been drastically shortened.
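To illustrate point 3, below is a minimal sketch of how MCP tool listings might be adapted to the OpenAI function-calling format. The function name and the sample tool are illustrative assumptions, not the actual mcp_client code; MCP tool descriptions carry their schema under `inputSchema`, while the OpenAI API expects it under `parameters`.

```python
# Hypothetical sketch of adapting MCP tool listings to the OpenAI
# `tools` parameter shape. Names here are assumptions for illustration.

def mcp_tools_to_openai(tools):
    """Convert a list of MCP tool descriptions into the
    OpenAI function-calling `tools` format."""
    converted = []
    for tool in tools:
        converted.append({
            "type": "function",
            "function": {
                "name": tool["name"],
                "description": tool.get("description", ""),
                # MCP lists the schema as `inputSchema`; OpenAI wants `parameters`
                "parameters": tool.get(
                    "inputSchema", {"type": "object", "properties": {}}
                ),
            },
        })
    return converted

# Example input shaped like a Google Maps MCP tool listing (illustrative)
maps_tools = [
    {
        "name": "maps_geocode",
        "description": "Convert an address into geographic coordinates",
        "inputSchema": {
            "type": "object",
            "properties": {"address": {"type": "string"}},
            "required": ["address"],
        },
    }
]

openai_tools = mcp_tools_to_openai(maps_tools)
```

The converted list can then be passed as the `tools` argument of an OpenAI-compatible chat completions request.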

There are still a few things that I want to work on:

  1. Containerize the application
  2. Determine how multiple tools can be used
  3. Mix Stdio and SSE servers

Running the code

  1. Clone this repo.
  2. Obtain a Google Maps API key from the Google Cloud Console.
  3. Rename the .env.example file to .env.
  4. Enter your inference server's API URL, API key, and Google Maps API key into the appropriate fields in the .env file.
  5. Execute the run.sh script, or run `poetry run streamlit run llm-mcp-streamlit/main.py`.
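For reference, the filled-in .env might look something like the sketch below. The variable names are assumptions based on a typical OpenAI-compatible setup; check the .env.example in the repo for the actual names.

```
# Hypothetical values — consult .env.example for the real variable names
API_URL=http://localhost:8000/v1        # OpenAI-compatible endpoint of the local inference server
API_KEY=placeholder                     # many local servers accept any non-empty key
GOOGLE_MAPS_API_KEY=your-google-maps-api-key
```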
