This repository contains code to securely run SLMs (small language models) locally using Node.js (server side) or in the browser.
Updated Nov 25, 2025 - JavaScript
A lightweight frontend for LM Studio local server APIs. Built using React, Vite, and Tailwind CSS with full support for streaming responses and GitHub Flavored Markdown.
Always-on companion for Claude that remembers your decisions and their evolution. Local-first memory using SQLite + transformers.js embeddings.
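The memory project above retrieves past decisions by comparing embedding vectors. A minimal sketch of that retrieval idea in plain JavaScript, using cosine similarity over hard-coded illustrative vectors — in the actual project the embeddings would come from transformers.js and the rows from SQLite, so the `memory` array and `recall` helper here are hypothetical stand-ins:

```javascript
// Cosine similarity between two equal-length embedding vectors.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Hypothetical memory rows: { text, embedding } pairs. Real embeddings
// would be produced by a transformers.js feature-extraction pipeline
// and persisted in SQLite.
const memory = [
  { text: 'decided to use SQLite for storage', embedding: [0.9, 0.1, 0.0] },
  { text: 'frontend built with React and Vite', embedding: [0.1, 0.9, 0.2] },
];

// Return the k stored entries most similar to the query embedding.
function recall(queryEmbedding, k = 1) {
  return memory
    .map((row) => ({ ...row, score: cosineSimilarity(queryEmbedding, row.embedding) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}

console.log(recall([0.85, 0.15, 0.05])[0].text);
// → 'decided to use SQLite for storage'
```

The same ranking works whether the vectors live in memory, in a SQLite table, or in a dedicated vector store; only the fetch step changes.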
Blog resources for building a self-hosted AI infrastructure. Contains all code samples and configurations from the tutorial series.