Delta: LLM conversation branching
A playground for learning by doing.
Updated Dec 30, 2024 - TypeScript
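The branching model a tool like this is built around can be sketched as a message tree: each reply points at its parent, and forking a conversation is just adding a new child under an earlier message. This is a minimal illustrative sketch, not Delta's actual API; all names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One message in a branching conversation tree (illustrative)."""
    role: str                       # "user" or "assistant"
    text: str
    parent: "Node | None" = None
    children: list["Node"] = field(default_factory=list)

    def reply(self, role: str, text: str) -> "Node":
        # Replying to an older node (not the current leaf) creates a branch.
        child = Node(role, text, parent=self)
        self.children.append(child)
        return child

    def path(self) -> list[str]:
        # Linear transcript from the root down to this node, i.e. the
        # context that would be sent to the LLM for this branch.
        node, msgs = self, []
        while node is not None:
            msgs.append(f"{node.role}: {node.text}")
            node = node.parent
        return list(reversed(msgs))

# Two branches forked from the same user question.
root = Node("user", "Explain recursion")
a = root.reply("assistant", "Recursion is a function calling itself.")
b = root.reply("assistant", "Think of it as nested Russian dolls.")
print(a.path())  # transcript of branch 1 only
print(b.path())  # transcript of branch 2 only
```

Each branch replays only its own ancestor chain, so siblings never leak into each other's context.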
A terminal-based tool for building flexible AI workflows anywhere. Process documents, create pipelines, and manage context from the command line.
Universal local AI agent for querying any MCP-enabled data source using Ollama - vaults, databases, emissions data, and more. 100% offline, 100% sovereign.
A fully offline, privacy-centric voice assistant built on lightweight local AI: speech-to-text with Vosk, LLM inference with GGUF models via Llama.cpp, and text-to-speech with Kokoro, for low-latency, secure voice interaction directly on your machine.
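The three-stage pipeline this description names (Vosk for speech-to-text, a GGUF model via Llama.cpp for the LLM, Kokoro for text-to-speech) can be sketched as three swappable interfaces wired into one turn. The interface names and echo implementations below are illustrative placeholders, not the project's actual code.

```python
from typing import Protocol

class SpeechToText(Protocol):
    def transcribe(self, audio: bytes) -> str: ...

class LanguageModel(Protocol):
    def complete(self, prompt: str) -> str: ...

class TextToSpeech(Protocol):
    def synthesize(self, text: str) -> bytes: ...

def run_turn(stt: SpeechToText, llm: LanguageModel, tts: TextToSpeech,
             audio_in: bytes) -> bytes:
    """One voice-assistant turn: audio in, spoken reply out.
    Every stage runs locally, so no network calls appear anywhere."""
    text = stt.transcribe(audio_in)   # e.g. a Vosk recognizer
    reply = llm.complete(text)        # e.g. llama.cpp on a GGUF model
    return tts.synthesize(reply)      # e.g. a Kokoro voice

# Dummy echo stages to show the data flow end to end.
class EchoSTT:
    def transcribe(self, audio: bytes) -> str:
        return audio.decode()

class EchoLLM:
    def complete(self, prompt: str) -> str:
        return f"You said: {prompt}"

class EchoTTS:
    def synthesize(self, text: str) -> bytes:
        return text.encode()

print(run_turn(EchoSTT(), EchoLLM(), EchoTTS(), b"hello"))  # b'You said: hello'
```

Keeping the stages behind interfaces is what lets an assistant like this swap in a different local model or voice without touching the turn loop.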
(Experiment) Predefined set of instructions for local agents governing LLM usage and selection