This repository contains a complete example of a chat application powered by a Model Context Protocol (MCP) server written in Python. It demonstrates how to:
- Build and run an MCP Server that interacts with Azure Cosmos DB for NoSQL
- Use `chainlit` and `gradio` as MCP clients to build interactive chat UIs
- Host both the server and the client locally, or deploy them to Azure
The MCP server exposes tools that let LLM agents insert and query data from Azure Cosmos DB. It uses Managed Identity for authentication and assumes the use of Azure OpenAI for embeddings and LLM responses (you can modify it to use other providers as needed).
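For a sense of what such a tool looks like, here is a minimal sketch (not the repository's actual implementation) using the `mcp` Python SDK's FastMCP helper together with `azure-cosmos` and `azure-identity`. The database name, container name, and port are assumptions for illustration:

```python
# A minimal sketch, not this repo's actual server code.
# Assumes: pip install mcp azure-cosmos azure-identity
import os

from azure.cosmos import CosmosClient
from azure.identity import DefaultAzureCredential
from mcp.server.fastmcp import FastMCP

# Port 5000 is an assumption, chosen to match the URL used later in this README.
mcp = FastMCP("cosmosdb", port=5000)

# Managed Identity via DefaultAzureCredential instead of an account key.
client = CosmosClient(os.environ["ACCOUNT_ENDPOINT"], credential=DefaultAzureCredential())
container = client.get_database_client("chat").get_container_client("messages")  # hypothetical names

@mcp.tool()
def insert_item(item: dict) -> dict:
    """Insert (upsert) a document into the Cosmos DB container."""
    return container.upsert_item(item)

@mcp.tool()
def query_items(query: str) -> list[dict]:
    """Run a SQL query against the Cosmos DB container."""
    return list(container.query_items(query=query, enable_cross_partition_query=True))

if __name__ == "__main__":
    mcp.run(transport="sse")  # expose the SSE endpoint that clients connect to
```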
The repository is organized as follows:

```
azure_containers/cosmosdb/   # MCP Server using Azure Container Apps
azure_functions/cosmosdb/    # MCP Server using Azure Functions
mcp_client/                  # MCP Client using Chainlit or Gradio
```
The MCP server exposes an SSE (Server-Sent Events) endpoint that the client connects to. You can run it locally with Docker or deploy it to Azure using Azure Container Apps or Functions.
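As a rough illustration of that handshake, an MCP client can connect to the SSE endpoint and list the server's tools with the `mcp` Python SDK (a sketch; the URL assumes the local Docker setup described below):

```python
# Sketch: connect to the MCP server over SSE and list its tools.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    async with sse_client("http://localhost:5000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```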
Create a `.env` file in the root of your server folder with:
```
ACCOUNT_KEY=<Your Azure Cosmos DB Account Key>
ACCOUNT_ENDPOINT=<Your Azure Cosmos DB Account Endpoint>
```
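For illustration, these values can be loaded with `python-dotenv` and used to build a key-based Cosmos DB client (a sketch; the README describes the server as using Managed Identity, so treat this as the key-based alternative these variables enable):

```python
# Sketch: load .env values and create a key-based Cosmos DB client.
import os

from azure.cosmos import CosmosClient
from dotenv import load_dotenv

load_dotenv()  # reads ACCOUNT_KEY and ACCOUNT_ENDPOINT from .env

client = CosmosClient(
    url=os.environ["ACCOUNT_ENDPOINT"],
    credential=os.environ["ACCOUNT_KEY"],
)
```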
Run the following commands:
```
docker-compose build
docker-compose up -d
```
Once running, the MCP Server is available at `http://localhost:5000/`.
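To sanity-check that the container is up before wiring a client to it, you can open a streaming request against the SSE endpoint and confirm it answers with a 200 (a sketch; `httpx` is an assumption here, not a stated project dependency):

```python
# Sketch: verify the SSE endpoint responds before connecting a client.
import httpx

with httpx.stream("GET", "http://localhost:5000/sse", timeout=10) as response:
    print(response.status_code)  # expect 200 if the server is running
```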
You can deploy the server with the Azure Container Apps VS Code extension:
- Create an Azure Container App in your subscription.
- Deploy the server code to it from VS Code.
Update your `.env` with the following GPT model details:
```
CHAT_MODEL_NAME=<Your GPT deployment name>
CHAT_MODEL_API_KEY=<API key for Azure OpenAI model>
CHAT_MODEL_BASE_URL=<Base URL of your Azure OpenAI deployment>
CHAT_MODEL_API_VERSION=2025-01-01-preview
```
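These variables map one-to-one onto the `openai` package's Azure client constructor. A minimal sketch of wiring them up, assuming `python-dotenv` for loading:

```python
# Sketch: build an Azure OpenAI client from the .env values above.
import os

from dotenv import load_dotenv
from openai import AzureOpenAI

load_dotenv()

client = AzureOpenAI(
    api_key=os.environ["CHAT_MODEL_API_KEY"],
    azure_endpoint=os.environ["CHAT_MODEL_BASE_URL"],
    api_version=os.environ["CHAT_MODEL_API_VERSION"],
)
```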
The MCP client is located in the `mcp_client/` folder. It uses either `chainlit` or `gradio` to create a front-end chat UI connected to the MCP server.
The client needs the same `.env` values as above:
```
CHAT_MODEL_NAME=<Your GPT deployment name>
CHAT_MODEL_API_KEY=<API key for Azure OpenAI model>
CHAT_MODEL_BASE_URL=<Base URL of your Azure OpenAI deployment>
CHAT_MODEL_API_VERSION=2025-01-01-preview
```
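With these values in place, the chat UI's model call reduces to a standard chat-completions request against your deployment (a sketch; `client` is the AzureOpenAI client constructed in the sketch above):

```python
# Sketch: one chat turn against the deployment named in CHAT_MODEL_NAME.
import os

response = client.chat.completions.create(
    model=os.environ["CHAT_MODEL_NAME"],  # the Azure deployment name
    messages=[{"role": "user", "content": "What data can you query for me?"}],
)
print(response.choices[0].message.content)
```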
Install dependencies:
```
pip install -r requirements.txt
```
Then run either interface:
Chainlit:
```
python -m chainlit run chat_app.py
```
Gradio:
```
python app.py
```
The app runs locally at `http://localhost:8000/`.
💡 Once opened, use the UI plug-in button to set your MCP server SSE endpoint (e.g., `http://localhost:5000/sse`, or your Azure-deployed server's endpoint).
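For orientation, a Gradio entry point along the lines of `app.py` can be as small as a `ChatInterface` wired to a handler (a sketch only; `respond` is a hypothetical placeholder, and the real app additionally calls the LLM and routes tool calls through the MCP server):

```python
# Sketch: a minimal Gradio chat UI, standing in for the real app.py.
import gradio as gr

def respond(message: str, history: list) -> str:
    # Placeholder: the real handler would call the LLM and the MCP tools.
    return f"Echo: {message}"

demo = gr.ChatInterface(respond)
demo.launch(server_port=8000)  # port 8000 matches the local URL above
```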
To deploy to Azure App Service:
- Use the Azure App Service VS Code extension.
- Create an App Service resource in your Azure subscription.
- Deploy the code from `mcp_client/`.
- This sample assumes Azure OpenAI for embeddings and LLM responses. To use another provider (e.g., OpenAI, Hugging Face), adjust the API calls accordingly; see the sketch after this list.
- The MCP Server supports Cosmos DB for NoSQL only, but you can extend it to other APIs if needed.
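For example, switching from Azure OpenAI to the plain OpenAI API is mostly a matter of swapping the client constructor (a sketch; the `OPENAI_API_KEY` variable and the model name are illustrative, not part of this repo's configuration):

```python
# Sketch: the same chat call against OpenAI instead of Azure OpenAI.
import os

from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
response = client.chat.completions.create(
    model="gpt-4o",  # a public model name instead of an Azure deployment name
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```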
We welcome contributions and feedback! Please open issues or PRs if you'd like to improve the project or have suggestions.