Frontend for MCP (Model Context Protocol) Kit for Go - a complete, ready-to-use MCP solution.
Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible plugin system. No data leaves your device.
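A UI like this typically talks to Ollama's local HTTP API on its default port 11434, which is what keeps data on-device. A minimal sketch of that call (the endpoint and `stream: false` response shape follow Ollama's documented API; the helper name and model tag are illustrative):

```ts
// Sketch: ask a locally running Ollama instance for a completion.
// Assumes the default endpoint (http://localhost:11434) and a pulled
// model such as "llama3"; adjust both to your local setup.
async function askOllama(prompt: string, model = "llama3"): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: false }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.response; // non-streaming responses carry the text in `response`
}
```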
A modern, local-first AI chat interface built with Tauri, React, and Rust. Secure and private, with support for multiple LLM providers. 🚀
🤖 MORVS AI - An intelligent chat interface powered by Groq's LLaMA 3 model with PDF processing capabilities. Built with Next.js, React, TypeScript, and modern UI components.
An AI chat application with a React frontend and a Nest.js backend. It lets users interact with an AI assistant through a simple, user-friendly chat interface.
A modern React-based client for Model Context Protocol (MCP) servers with AI-powered chat interface, Azure OpenAI integration, and advanced trace debugging capabilities.
Next.js-powered ChatGPT interface with intelligent API key management. Automatically tests and uses the first valid OpenAI API key, eliminating manual testing and switching.
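One way to implement that kind of key check is to probe each candidate key against a lightweight OpenAI endpoint and keep the first one that authenticates. A hedged sketch of the idea, not the project's actual code:

```ts
// Sketch: return the first OpenAI API key that authenticates successfully.
// Uses the public /v1/models endpoint as a cheap validity probe; the helper
// name and error handling are illustrative, not taken from the project.
async function firstValidKey(keys: string[]): Promise<string | null> {
  for (const key of keys) {
    const res = await fetch("https://api.openai.com/v1/models", {
      headers: { Authorization: `Bearer ${key}` },
    });
    if (res.ok) return key; // 200 → key is valid
    // 401/403 → invalid or revoked key, try the next one
  }
  return null; // no usable key found
}
```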
LLM Chat (UNFINISHED) is a chat interface for LLMs (such as Llama) via LM Studio, built with React and Node.js. It supports Markdown, chat history, and automatic titles. The project is incomplete: it lacks web search, images, data persistence, and authentication. MIT License.
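LM Studio exposes an OpenAI-compatible local server (by default at http://localhost:1234/v1), so a client like this one can reuse the standard chat-completions request shape. A rough sketch with assumed defaults:

```ts
// Sketch: send a chat turn to LM Studio's OpenAI-compatible local server.
// The default port 1234 and the "local-model" identifier are assumptions;
// check LM Studio's local server settings for the actual values.
async function chatWithLMStudio(messages: { role: string; content: string }[]) {
  const res = await fetch("http://localhost:1234/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "local-model", messages }),
  });
  if (!res.ok) throw new Error(`LM Studio request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```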
The open-source AI chat app for everyone.
Advanced AI conversation platform built with Next.js, TypeScript, and modern AI providers. Features multi-persona chats, real-time conversations, and production-ready architecture for professional AI applications.