gguf
Here are 94 public repositories matching this topic...
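The repositories below all revolve around loading or producing GGUF checkpoints. As a minimal sketch of what the format looks like on disk (assuming the documented little-endian layout: a 4-byte `GGUF` magic, a uint32 version, a uint64 tensor count, and a uint64 metadata key-value count), the fixed header can be parsed like this:

```python
import struct

GGUF_MAGIC = b"GGUF"  # ASCII magic bytes at the start of every GGUF file


def read_gguf_header(data: bytes) -> dict:
    """Parse the fixed GGUF preamble: magic, version, tensor count, metadata KV count."""
    if data[:4] != GGUF_MAGIC:
        raise ValueError("not a GGUF file")
    # Little-endian: uint32 version, uint64 tensor_count, uint64 metadata_kv_count
    version, tensor_count, kv_count = struct.unpack_from("<IQQ", data, 4)
    return {"version": version, "tensor_count": tensor_count, "metadata_kv_count": kv_count}


# Example with a synthetic header (version 3, 2 tensors, 5 metadata keys):
header = GGUF_MAGIC + struct.pack("<IQQ", 3, 2, 5)
print(read_gguf_header(header))
```

In practice you would read the first 24 bytes of a real `.gguf` file and pass them in; the metadata key-value pairs and tensor descriptors that follow the preamble are typed and variable-length, so a full reader is more involved.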
LLM Agent Framework in ComfyUI includes MCP server, Omost, GPT-SoVITS, ChatTTS, GOT-OCR2.0, and FLUX prompt nodes; offers access to Feishu and Discord; and adapts to all LLMs with OpenAI/aisuite-style interfaces, such as o1, Ollama, Gemini, Grok, Qwen, GLM, DeepSeek, Kimi, and Doubao. Also adapted to local LLMs, VLMs, and GGUF models such as Llama 3.3 and Janus-Pro, plus Linkage graphRAG
Updated Sep 8, 2025 - Python
Interface for OuteTTS models.
Updated Jun 21, 2025 - Python
Webscout is the all-in-one search and AI toolkit you need. Discover insights with Yep.com, DuckDuckGo, and Phind; access cutting-edge AI models; transcribe YouTube videos; generate temporary emails and phone numbers; perform text-to-speech conversions; and much more!
Updated Dec 9, 2025 - Python
Joy Caption is a ComfyUI node using the LLaVA model to generate stylized image captions, supporting batch processing and GGUF models.
Updated Nov 7, 2025 - Python
A custom ComfyUI node for MiniCPM vision-language models, supporting v4, v4.5, and v4 GGUF formats, enabling high-quality image captioning and visual analysis.
Updated Aug 28, 2025 - Python
Download models from the Ollama library, without Ollama
Updated Nov 13, 2024 - Python
Gradio-based tool to run open-source LLM models directly from Hugging Face
Updated Jun 27, 2024 - Python
Own your AI, search the web with it🌐😎
Updated Jan 14, 2025 - Python
A Simple Viewer for EXIF and AI Metadata
Updated Nov 15, 2025 - Python
Extract structured data from local or remote LLM models
Updated Jun 21, 2024 - Python
Local character AI chatbot with Chroma vector-store memory and some scripts to process documents for Chroma
Updated Oct 7, 2024 - Python
PyGPTPrompt: A CLI tool that manages context windows for AI models, facilitating user interaction and data ingestion for optimized long-term memory and task automation.
Updated May 21, 2024 - Python
SwiftLet is a lightweight Python framework for running open-source Large Language Models (LLMs) locally using safetensors
Updated Aug 6, 2025 - Python
GPU-accelerated LLaMA inference wrapper for legacy Vulkan-capable systems: a Pythonic way to run AI with knowledge (ilm) on fire (Vulkan).
Updated Oct 14, 2025 - Python