3 changes: 1 addition & 2 deletions README.md
@@ -108,8 +108,7 @@ Platform Engineering encompasses several key components:
- **[Cloud-Native Architectures](docs/cloud_native.md)**
- **[Internal Development Platforms](docs/internal_development_platforms.md)**
- **[Kubernetes](docs/kubernetes.md)**
- **[Podcasts](docs/podcasts.md)**
- **[Developer Experience (Local Platform Engineering Toolset)](docs/development_setup.md)**
- **[AI Infrastructure](docs/ai_infrastructure.md)**

### 🏗️ Platform Engineering Reference Architecture

62 changes: 62 additions & 0 deletions docs/ai_infrastructure.md
@@ -0,0 +1,62 @@
# 🤖 AI Infrastructure for Platform Engineering

**AI Infrastructure** refers to the foundational systems, tools, and platforms required to develop, deploy, and scale artificial intelligence (AI) and machine learning (ML) workloads. For platform engineering teams, building robust AI infrastructure means enabling data scientists, ML engineers, and developers to efficiently train, serve, and manage AI models—while ensuring scalability, security, and operational excellence.

---

## 🏗️ What Does AI Infrastructure Include?

- **Compute Resources:** High-performance CPUs, GPUs, and TPUs for training and inference.

- **Storage:** Scalable, high-throughput storage for datasets, models, and logs.

- **Networking:** Fast, reliable networking for distributed training and data movement.

- **Orchestration:** Tools like Kubernetes for managing containerized AI workloads.

- **Model Serving:** Systems for deploying and scaling AI models in production (e.g., KServe, Seldon Core).

- **Monitoring & Observability:** Tracking model performance, resource usage, and drift.

- **Security & Compliance:** Managing access, data privacy, and auditability.
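
To make the orchestration and model-serving components concrete, here is a minimal sketch (Python, standard library only) of building a KServe-style `InferenceService` manifest programmatically. The model name, storage URI, and resource figures are illustrative placeholders, not recommendations; check the KServe documentation for the authoritative schema.

```python
import json


def build_inference_service(name: str, storage_uri: str) -> dict:
    """Build a KServe-style InferenceService manifest as a plain dict.

    The resource requests/limits and model format below are
    illustrative placeholders for this sketch.
    """
    return {
        "apiVersion": "serving.kserve.io/v1beta1",
        "kind": "InferenceService",
        "metadata": {"name": name},
        "spec": {
            "predictor": {
                "model": {
                    "modelFormat": {"name": "sklearn"},
                    "storageUri": storage_uri,
                    "resources": {
                        "requests": {"cpu": "1", "memory": "2Gi"},
                        "limits": {"cpu": "2", "memory": "4Gi"},
                    },
                }
            }
        },
    }


# Hypothetical model name and bucket path for illustration.
manifest = build_inference_service("demo-model", "gs://example-bucket/models/demo")
print(json.dumps(manifest, indent=2))
```

The same dict can be serialized to YAML and applied with `kubectl`, or created through the Kubernetes API from a CI/CD pipeline.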

---

## 🚀 AI Infrastructure Submodules

Explore these key submodules to learn how platform engineering teams can implement and scale AI infrastructure:

- [**Running LLMs Inside Kubernetes**](./ai_infrastructure/llms_in_kubernetes.md)
  *Coming soon: guidance on deploying and managing large language models (LLMs) such as GPT or Llama within Kubernetes clusters for scalable, secure, and integrated AI inference.*

- [**Running LLMs Locally with Azure Foundry Local**](./ai_infrastructure/azure_foundry_local.md)
*Coming soon: How to run large language models (LLMs) on your own infrastructure using Azure Foundry Local for performance, privacy, customization, and cost advantages. [Learn more here.](https://foundry.microsoft.com/local)*

- [**Model Context Protocol (MCP)**](./mcp/model_context_protocol.md)
*An open standard for describing, sharing, and managing the context in which AI models operate—enabling interoperability, reproducibility, and integration across platforms. [Learn more at the official MCP site.](https://modelcontextprotocol.io/introduction#general-architecture)*

---

## 🌐 Why AI Infrastructure Matters for Platform Engineering

- **Scalability:** Meet growing AI/ML workload demands.

- **Efficiency:** Automate deployment, scaling, and monitoring of models.

- **Security:** Enforce policies and compliance for sensitive data and models.

- **Innovation:** Enable rapid experimentation and faster time-to-value for AI initiatives.

---

## 📚 Further Reading

- [KServe: Model Serving on Kubernetes](https://kserve.github.io/website/)
- [Azure Machine Learning + Kubernetes](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-attach-kubernetes-anywhere?view=azureml-api-2)
- [Seldon Core: Open Source Model Serving](https://docs.seldon.io/projects/seldon-core/en/latest/)
- [Google Vertex AI](https://cloud.google.com/vertex-ai)
- [Model Context Protocol (Official Site)](https://modelcontextprotocol.io/introduction#general-architecture)

---

AI infrastructure is a critical enabler for modern platform engineering, empowering teams to deliver intelligent applications at scale—whether running on Kubernetes, leveraging managed cloud platforms, or combining both approaches.
44 changes: 44 additions & 0 deletions docs/mcp/getting_started_with_mcp.md
@@ -0,0 +1,44 @@
# 🧠 Top Model Context Protocol (MCP) Servers

Welcome to the **Code to Cloud Platform Engineering** knowledge hub. This document highlights the most relevant and impactful **Model Context Protocol (MCP)** servers available today. MCP servers bridge AI agents, developer tools, and cloud-native infrastructure via a standardized JSON-RPC interface.

MCP servers enable advanced automation, reasoning, and orchestration by exposing authoritative APIs to AI copilots, developer environments, and multi-agent systems.

---

## 🔝 Top MCP Servers

| Rank | MCP Server | Description | Maintainer | Link |
|------|------------|-------------|------------|------|
| 1 | **Microsoft Docs MCP Server** | Provides trusted, up-to-date results from Microsoft Docs, allowing LLMs to ground responses in official Azure and Microsoft documentation. Ideal for copilots and agents that need authoritative technical answers, code samples, and best practices directly from Microsoft’s knowledge base. | Microsoft | [Microsoft Docs MCP Server on GitHub](https://github.com/MicrosoftDocs/mcp) |
| 2 | **GitHub MCP Server** | Connects to GitHub’s API for real-time access to pull requests, issues, repositories, and workflows. Enables developer automation, code review, and repository insights through natural language prompts. Useful for copilots that assist with code management, CI/CD, and project tracking. | GitHub | [GitHub MCP Server on GitHub](https://github.com/github/github-mcp-server) |
| 3 | **Terraform MCP Server** | Bridges the Terraform Registry with AI tooling, enabling LLMs to search, recommend, and validate infrastructure-as-code modules and policies. Supports DevOps workflows by providing context-aware IaC guidance, compliance checks, and resource documentation. | HashiCorp | [Terraform MCP Server on GitHub](https://github.com/hashicorp/terraform-mcp-server) |
| 4 | **Azure DevOps MCP Server** | Exposes Azure DevOps resources such as pipelines, repositories, and boards. Allows copilots and agents to interact with CI/CD systems, automate deployments, monitor build status, and manage work items using conversational commands. | Microsoft | [Azure DevOps MCP Server on GitHub](https://github.com/microsoft/azure-devops-mcp) |
| 5 | **Azure MCP Server** | Exposes a wide range of Azure tools and services for natural language interaction. Use it to manage and query Azure resources conversationally from clients like GitHub Copilot agent mode in VS Code or other AI agents.<br/>Example prompts:<br/>• "Show me all my resource groups"<br/>• "List blobs in my storage container named 'documents'"<br/>• "What's the value of the 'ConnectionString' key in my app configuration?"<br/>• "Query my log analytics workspace for errors in the last hour"<br/>• "Show me all my Cosmos DB databases" | Microsoft | [Azure MCP Server on GitHub](https://github.com/Azure/azure-mcp) |
| 6 | **Dagger Container Use MCP Server** | Lets each coding agent have its own containerized environment. Enables multiple agents to work safely and independently with your preferred stack.<br/>Features:<br/>• 📦 Isolated Environments<br/>• 👀 Real-time Visibility<br/>• 🚁 Direct Intervention<br/>• 🎮 Environment Control<br/>• 🌎 Universal Compatibility<br/>Works as a CLI tool with Claude Code, VS Code, Cursor, and other MCP-compatible agents. | Dagger | [Dagger Container Use MCP Server on GitHub](https://github.com/dagger/container-use) |
| 7 | **npm MCP Server** | Interfaces with the npm registry, allowing copilots to suggest, search, and retrieve metadata for JavaScript/TypeScript packages. Supports package discovery, dependency management, and vulnerability checks directly from development environments. | Community | [mcp-server on npm](https://www.npmjs.com/package/mcp-server) |
| 8 | **Figma Context MCP Server** | Integrates Figma design data with LLMs and agents, enabling natural language access to design files, components, and project assets. Useful for design copilots, automated documentation, and design-to-code workflows. | Community | [Figma Context MCP Server on GitHub](https://github.com/GLips/Figma-Context-MCP) |
| 9 | **OpenAPI MCP Server Scaffold** | A boilerplate for wrapping any OpenAPI-compliant service into an MCP server. Ideal for quickly exposing internal APIs or SaaS products to LLMs and agents, enabling natural language access to custom business logic and data. | Community | [MCP docs at OpenAI](https://platform.openai.com/docs/mcp) |
| 10 | **Notion MCP Server** | Connects LLMs and agents to Notion workspaces, enabling natural language access to notes, documents, databases, and project management resources. Useful for productivity copilots, knowledge management, and workflow automation within Notion. | Notion | [Notion MCP Server on GitHub](https://github.com/makenotion/notion-mcp-server) |
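
To use any of these servers, an MCP-capable client needs to know how to launch or reach them. The sketch below emits a client configuration file in the common shape (a map of server names to a launch command and arguments). The exact key names and file location vary by client (VS Code, Claude Desktop, Cursor), and the command shown is a hypothetical placeholder, so treat this as an assumption and consult each server's README.

```python
import json

# Illustrative MCP client configuration. Most MCP-capable clients accept
# a map of server names to a launch command plus arguments; the schema
# and the "github-mcp-server" command here are assumptions for this
# sketch, not a verified client contract.
config = {
    "servers": {
        "github": {
            "command": "github-mcp-server",  # hypothetical binary name
            "args": ["stdio"],               # speak MCP over stdio
        }
    }
}

print(json.dumps(config, indent=2))
```

Once registered, the client starts the server process and negotiates capabilities over the Model Context Protocol.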

---

## ⚙️ Why MCP Matters for Platform Engineering

Model Context Protocol enables:

- ✨ **Grounded AI assistants:** Context-aware and source-authoritative copilots.
- ⚙️ **Automated CI/CD and IaC workflows:** Natural language interfaces for DevOps.
- 📊 **Insights and analytics:** From source control or artifact registries.
- 🧠 **Multi-agent cooperation:** In cloud-native developer environments.

---

## 🧪 Contributing

**Have you built or deployed your own MCP server for internal APIs, cloud systems, or SaaS products?**
We’d love to feature community-contributed MCP endpoints here. Open a PR or reach out via our [contributing page](../../CONTRIBUTING.md).

---

🏔️ [codetocloud.io](https://codetocloud.io)
Binary file added docs/mcp/images/mpc_architecture.png
44 changes: 44 additions & 0 deletions docs/mcp/model_context_protocol.md
@@ -0,0 +1,44 @@
# 🔗 Model Context Protocol (MCP) Overview

The **Model Context Protocol (MCP)** is an open protocol that standardizes how applications provide context to large language models (LLMs). Think of MCP like a USB-C port for AI: it offers a universal way to connect AI models to various data sources and tools.

---

## 🚀 Why Use MCP?

- **Plug-and-Play Integrations:** Easily connect your LLM to a growing list of pre-built integrations.
- **Provider Flexibility:** Switch between LLM vendors without changing your integration approach.
- **Security Best Practices:** Keep your data secure within your own infrastructure.

---

## 🏗️ MCP Architecture

<p align="center">
<img src="./images/mpc_architecture.png" alt="MCP Architecture Diagram" width="40%">
</p>

MCP uses a client-server architecture, allowing host applications to connect to multiple servers and data sources:

- **MCP Hosts:** Apps like Claude Desktop, IDEs (VS Code), or AI tools accessing data via MCP.
- **MCP Clients:** Maintain 1:1 connections with MCP servers.
- **MCP Servers:** Lightweight programs exposing capabilities through MCP.
- **Local Data Sources:** Files, databases, and services on your computer.
- **Remote Services:** External APIs and internet services.

This setup lets LLMs and AI tools securely access both local and remote data, enabling powerful, context-aware AI applications.
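
Under the hood, clients and servers exchange JSON-RPC 2.0 messages. The minimal sketch below builds a request of the kind an MCP client sends (the `tools/list` method comes from the MCP specification) and parses a hypothetical server response; the tool name and description are invented for illustration.

```python
import json


def make_request(request_id: int, method: str, params: dict = None) -> str:
    """Serialize a JSON-RPC 2.0 request of the shape MCP clients send."""
    msg = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)


# A client asking a server which tools it exposes.
request = make_request(1, "tools/list")

# A hypothetical server response listing one made-up tool.
response = json.loads(
    '{"jsonrpc": "2.0", "id": 1, "result": {"tools": ['
    '{"name": "search_docs", "description": "Search documentation"}]}}'
)

print(response["result"]["tools"][0]["name"])
```

Responses carry the same `id` as the request, which is how a client running multiple servers correlates replies.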

---

## 🚦 Getting Started with MCP Servers

To help code-to-cloud teams quickly leverage the Model Context Protocol, we’ve curated a list of recommended MCP servers that can boost your development productivity. These servers provide ready-to-use integrations with common data sources, tools, and services.

Explore the submodule below for setup instructions, usage examples, and best practices:

- [**Getting Started with MCP Servers**](getting_started_with_mcp.md)
*A curated guide to installing, configuring, and using popular MCP servers for rapid integration with your LLM workflows.*

---

**Learn more:** [Model Context Protocol Introduction](https://modelcontextprotocol.io/introduction#general-architecture)
File renamed without changes.