Use your locally running AI models to assist you in your web browsing
Updated Nov 2, 2025 - TypeScript
Run Open Source/Open Weight LLMs locally with OpenAI compatible APIs
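Several of these projects expose local models behind an OpenAI-compatible HTTP API (llama.cpp's server, LM Studio, and Ollama all do this). As a minimal sketch of what talking to such an endpoint looks like: the base URL and port below are assumptions (they vary by server), and the payload shape follows the standard OpenAI chat-completions format.

```python
import json
import urllib.request

# Assumed local endpoint; the port differs per server (e.g. llama.cpp,
# LM Studio, Ollama), so adjust BASE_URL for your setup.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completions payload for a local server."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def chat(model: str, prompt: str) -> str:
    """POST the payload to the local server and return the reply text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_chat_request(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible servers return choices[0].message.content
    return body["choices"][0]["message"]["content"]
```

Because the request format matches the hosted OpenAI API, existing client code can usually be pointed at a local model just by changing the base URL.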
Cognito: Supercharge your Chrome browser with AI. Guide, query, and control everything using natural language.
Chat with your PDF using your local LLM via the Ollama client.
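The usual pattern behind "chat with your PDF" tools is to extract the document's text, split it into chunks that fit the model's context window, and pass a relevant chunk to the model alongside the question. A minimal sketch of the chunking step and an Ollama `/api/chat` payload (the model name `llama3` and the chunk size are illustrative assumptions, not taken from this project):

```python
def chunk_text(text: str, max_chars: int = 1500) -> list[str]:
    """Split extracted PDF text into word-aligned chunks of at most
    max_chars characters, so each chunk fits a small context window."""
    chunks, current, length = [], [], 0
    for word in text.split():
        if length + len(word) + 1 > max_chars and current:
            chunks.append(" ".join(current))
            current, length = [], 0
        current.append(word)
        length += len(word) + 1
    if current:
        chunks.append(" ".join(current))
    return chunks

def build_ollama_chat(question: str, context: str) -> dict:
    """Payload for Ollama's /api/chat endpoint; the model name is an
    example and should match a model you have pulled locally."""
    return {
        "model": "llama3",
        "messages": [
            {"role": "system",
             "content": f"Answer using only this document excerpt:\n{context}"},
            {"role": "user", "content": question},
        ],
        "stream": False,
    }
```

POSTing that payload to `http://localhost:11434/api/chat` (Ollama's default address) returns the model's answer grounded in the supplied excerpt.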
A P2P network where users share, trade, and sell AI prompts and prompt chains, and pool and rent distributed compute for inference and training. All transactions happen P2P, with a built-in 3% fee.
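The project's own settlement logic isn't shown here; as a hedged sketch of how a flat 3% fee might be applied to a sale, using integer cents to avoid floating-point rounding (the ceiling-division choice is an assumption, not the project's documented behavior):

```python
def settle(price_cents: int, fee_pct: int = 3) -> tuple[int, int]:
    """Split a P2P sale price into (network fee, seller payout).

    Works in integer cents to avoid float rounding; the fee rounds up
    (ceiling division) so a nonzero sale never yields a zero fee.
    """
    fee = -(-price_cents * fee_pct // 100)  # ceiling division
    return fee, price_cents - fee
```

For example, a $10.00 sale (1000 cents) splits into a 30-cent fee and a 970-cent payout.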
LocalSeek 🤖💬 is a powerful, visually stunning, privacy-first AI chat extension for Visual Studio Code.
An Ollama-based interface providing agentic abilities to LLMs.
An AI-powered Chrome extension that reads those pesky privacy policies and saves you from accidentally agreeing to sell your soul.
Vibecode Editor is a fullstack, web-based IDE built with Next.js and Monaco Editor. It features real-time code execution using WebContainers, AI-powered code suggestions via locally running Ollama models, multi-stack templates, an integrated terminal, and a developer-focused UI for seamless coding in the browser.
iFusionOne: the one tool you need.