A generalized information-seeking agent system with Large Language Models (LLMs).
Updated Jun 19, 2024 - Python
[ICML 2024] SqueezeLLM: Dense-and-Sparse Quantization
[NeurIPS 2024] KVQuant: Towards 10 Million Context Length LLM Inference with KV Cache Quantization
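The full dense-and-sparse schemes behind SqueezeLLM and KVQuant are far beyond a tagline, but the core idea both build on, quantizing weights or cached KV values to low-bit integers with a scale factor, can be illustrated with a minimal symmetric int8 round-trip. This is a pure-Python sketch for intuition only, not either project's actual algorithm:

```python
def quantize_int8(values: list[float]) -> tuple[list[int], float]:
    """Symmetric quantization: map floats to [-127, 127] with a single scale."""
    scale = max(abs(v) for v in values) / 127 or 1.0  # guard against all-zero input
    return [round(v / scale) for v in values], scale

def dequantize(quantized: list[int], scale: float) -> list[float]:
    """Recover approximate floats from the int8 codes."""
    return [q * scale for q in quantized]

vals = [0.5, -1.0, 0.25]
q, s = quantize_int8(vals)
restored = dequantize(q, s)
# Per-element round-trip error is bounded by scale / 2.
```

The papers' contribution is precisely in going beyond this naive scheme (handling outliers sparsely, pushing to very low bit widths); the sketch only shows the baseline they improve on.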
The PyVisionAI Official Repo
MVP of an idea using multiple local LLMs to simulate and play D&D
A local chatbot for managing docs
LocalLab allows you to easily run Hugging Face AI models locally or on Google Colab, featuring automatic API setup, model management, performance optimization, and system monitoring.
Demo project showcasing Gemma3 function calling capabilities using Ollama. Enables automatic web searches via Serper.dev for up-to-date information and features an interactive Gradio chat interface.
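The function-calling loop such a demo relies on can be sketched in a few lines: the model emits a structured tool call, and the host program routes it to a Python function. The message shape and the `web_search` stub below are assumptions for illustration (standing in for the Serper.dev call the description mentions), not the demo's actual code:

```python
import json

def web_search(query: str) -> str:
    # Stub standing in for a real web-search API call (e.g. Serper.dev).
    return f"stub results for: {query}"

TOOLS = {"web_search": web_search}

def dispatch_tool_call(message: dict) -> str:
    """Route a model-emitted tool call (name + arguments) to a registered function."""
    call = message["tool_calls"][0]["function"]
    fn = TOOLS[call["name"]]
    args = call["arguments"]
    if isinstance(args, str):          # some backends serialize arguments as JSON text
        args = json.loads(args)
    return fn(**args)

# A message shaped like a chat-API tool-call response (assumed structure).
model_message = {
    "tool_calls": [
        {"function": {"name": "web_search", "arguments": {"query": "latest LLM news"}}}
    ]
}
result = dispatch_tool_call(model_message)
```

In a real loop, `result` would be appended to the conversation as a tool message and the model queried again to produce the final answer.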
Read your local files and answer your queries
Local AI Open Orca For Dummies is a user-friendly guide to running Large Language Models locally. Simplify your AI journey with easy-to-follow instructions and minimal setup. Perfect for developers tired of complex processes!
A minimal CLI tool to locally summarize any text using LLM!
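A tool like this typically follows a chunk-then-summarize pattern: split long input into pieces that fit a local model's context window, then prompt per chunk. The sketch below shows that pattern under stated assumptions (word-based chunking, an illustrative prompt template); it is not the tool's actual implementation:

```python
def chunk_text(text: str, max_words: int = 200) -> list[str]:
    """Split text into chunks of at most max_words whitespace-separated words."""
    words = text.split()
    return [" ".join(words[i:i + max_words]) for i in range(0, len(words), max_words)]

def build_prompt(chunk: str) -> str:
    # Illustrative prompt wording; a real tool would tune this per model.
    return f"Summarize the following text in 2-3 sentences:\n\n{chunk}"

chunks = chunk_text("word " * 450, max_words=200)
print(len(chunks))  # 450 words -> 3 chunks
```

Each prompt would then be sent to the local model, and per-chunk summaries could optionally be concatenated and summarized once more for a final result.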