A lightweight local LLM chat with a web UI and a C‑based server that runs any LLM chat executable as a child and communicates via pipes
Updated Nov 2, 2025 · C
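The description above outlines the architecture: a C server spawns the chat executable as a child process and exchanges data with it over pipes. The following is a minimal, hypothetical sketch of that pattern, not the repository's actual code; the child command `./llm-chat` and the single-prompt protocol are placeholder assumptions for illustration.

```c
/*
 * Minimal sketch (assumed, not the repository's implementation):
 * spawn a chat executable as a child and talk to it over pipes.
 * "./llm-chat" is a placeholder for whatever binary the server runs.
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <sys/wait.h>

int main(void) {
    int to_child[2];   /* parent writes -> child stdin  */
    int from_child[2]; /* child stdout  -> parent reads */

    if (pipe(to_child) == -1 || pipe(from_child) == -1) {
        perror("pipe");
        return 1;
    }

    pid_t pid = fork();
    if (pid == -1) {
        perror("fork");
        return 1;
    }

    if (pid == 0) { /* child: wire the pipes to stdin/stdout, then exec */
        dup2(to_child[0], STDIN_FILENO);
        dup2(from_child[1], STDOUT_FILENO);
        close(to_child[0]);   close(to_child[1]);
        close(from_child[0]); close(from_child[1]);
        execlp("./llm-chat", "llm-chat", (char *)NULL); /* placeholder binary */
        perror("execlp");
        _exit(127);
    }

    /* parent: keep only the pipe ends it actually uses */
    close(to_child[0]);
    close(from_child[1]);

    const char *prompt = "Hello, model!\n";
    write(to_child[1], prompt, strlen(prompt));
    close(to_child[1]); /* EOF tells the child the prompt is complete */

    char buf[4096];
    ssize_t n;
    while ((n = read(from_child[0], buf, sizeof buf)) > 0)
        fwrite(buf, 1, (size_t)n, stdout); /* relay the child's output */

    close(from_child[0]);
    waitpid(pid, NULL, 0);
    return 0;
}
```

In a real server the parent would keep both pipe ends open for the lifetime of the chat session and relay data between the web UI and the child rather than sending a single prompt, but the fork/pipe/exec wiring shown here is the core of the approach the description names.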