A revolutionary AI interaction platform that enhances traditional LLM experiences with a versatile, visual interface for exploring AI technologies directly in your browser.
- Overview
- Live Demo
- Features Showcase
- Vision
- Project Architecture
- Flow Machine Engine
- Project Structure
- Libraries and Tools
- Getting Started
- Contributing
- License
- Contact
NoLLM Chat rethinks AI interaction with a platform that goes beyond basic chat. It lets users work with language models in ways that boost creativity and enrich the experience through:
- Visual Interface: Node-based workflow creation and management
- Browser-Based: Runs locally and free of charge, with optional cloud extensions
- Workflow Automation: Create custom AI workflows tailored to your needs
- Comprehensive Learning: Interactive tools for LLMs, prompt engineering, and vector databases
Development Progress: [■■■□□□□□□□] 30%
Move beyond traditional LLM chat with a platform offering a more flexible and visual interface. Users can directly edit and guide AI to improve response quality, enabling richer interaction experiences.
Empowers users to create custom AI workflows tailored to their needs, enhancing productivity and personalization.
Utilize node-based tools that facilitate interaction with and learning about AI technologies. The platform supports LLMs, prompt engineering, function calls, and vector databases, allowing users to experiment and see the impact of different AI components.
Operates locally and free of charge, with the option to extend capabilities using services like OpenAI. This ensures accessibility and ease of use directly from the browser.
```
src/
│
├── assets/          # Static assets like images and fonts
├── components/      # Reusable React components
├── constants/       # Constant values and configuration settings
├── contexts/        # React context providers for global state management
├── css/             # Styling files (CSS or preprocessor files)
├── hooks/           # Custom React hooks
├── i18n/            # Internationalization setup and resources
├── lib/             # Utility libraries and third-party integrations
├── pages/           # Page components for different routes
├── services/        # API calls and service functions
├── states/          # State management files (e.g., Zustand)
├── utils/           # Utility functions and helpers
│
├── App.tsx          # Main application component
├── main.tsx         # Entry point of the application
└── routes.tsx       # Route configurations
```
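To show how the entry points above fit together, here is a rough sketch of what a `routes.tsx` built on React Router could look like. The paths and page components are hypothetical illustrations, not the project's actual routes.

```tsx
// Hypothetical routes.tsx sketch using React Router's data router API.
// The paths and page components below are illustrative only.
import { createBrowserRouter } from 'react-router-dom'

import App from './App'
import Home from './pages/Home'
import Chat from './pages/Chat'

export const router = createBrowserRouter([
  {
    path: '/',
    element: <App />, // App supplies layout, providers, and error boundaries
    children: [
      { index: true, element: <Home /> },
      { path: 'chat/:sessionId', element: <Chat /> },
    ],
  },
])
```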
The architecture splits work across separate browser threads: the UI stays responsive on the main thread while heavier jobs such as database access, model inference, and embedding run in the background. The table and diagram below summarize the split, and a minimal worker-messaging sketch follows them.
Thread | Responsibility | Technologies |
---|---|---|
Main Thread | UI application logic and responsive interface | React, ReactFlow, Zustand |
Database Worker | Data storage and retrieval operations | TypeORM, PGlite |
LLM Thread | Large language model processes and AI computations | WebLLM, Langchain |
Embedding Thread | Vector database and embedding model processing | Memory Vector DB, Voy |
```mermaid
graph LR
    A[Main Thread] <--> C[Database Worker Thread]
    C -->|uses| I((TypeORM))
    I -->|wraps| D((PGlite))
    A <--> E[LLM Thread]
    E -->|uses| J((Langchain))
    J -->|wraps| F((WebLLM))
    A <--> G[(Memory Vector Database)]
    G --> K[Embedding Thread]
    K -->|uses| L((Embedding Model))
    A -->|handles| B((UI Application Logic))
```
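To make the thread split concrete, here is a minimal sketch of how the main thread could hand work to a dedicated worker via `postMessage`. The worker path, file name, and message shape are assumptions for illustration, not the project's actual worker protocol.

```ts
// Hypothetical sketch: delegating an LLM request to a worker so the UI never blocks.
// The worker path and message shape are illustrative assumptions.
const llmWorker = new Worker(new URL('./workers/llm.worker.ts', import.meta.url), {
  type: 'module',
})

// Results arrive asynchronously on the main thread.
llmWorker.onmessage = (event: MessageEvent<{ id: string; text: string }>) => {
  console.log('completion for', event.data.id, event.data.text)
}

llmWorker.postMessage({ id: 'req-1', prompt: 'Summarize this document.' })
```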
The Flow Machine is the core orchestration engine behind NoLLM Chat's workflow capabilities. It uses a two-phase execution system (prepare, then execute) to manage complex AI workflows with dynamic data sharing and dependency management; a minimal sketch of the idea follows the feature list below.
- Two-Phase Execution: Separate prepare and execute phases for optimal performance
- Dynamic Dependency Resolution: Automatic discovery of upstream node dependencies
- Shared Session State: Seamless data sharing between connected nodes
- Modular Handler System: Extensible architecture for custom node types
- Topological Sorting: Ensures proper execution order with cycle detection
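The sketch below illustrates the two-phase idea with a topological sort over a small node graph. All names here (`FlowNode`, `NodeHandler`, `runFlow`) are hypothetical simplifications, not the actual FlowMachine API.

```ts
// Hypothetical sketch of two-phase execution over a node graph.
// Types and names are illustrative; they do not mirror the real FlowMachine API.
interface FlowNode {
  id: string
  type: string
  deps: string[] // ids of upstream nodes this node reads from
}

interface NodeHandler {
  prepare(node: FlowNode, session: Map<string, unknown>): Promise<void>
  execute(node: FlowNode, session: Map<string, unknown>): Promise<unknown>
}

// Order upstream dependencies before dependents; throw on cycles.
function topologicalOrder(nodes: Map<string, FlowNode>, targetId: string): FlowNode[] {
  const ordered: FlowNode[] = []
  const visiting = new Set<string>()
  const done = new Set<string>()

  const visit = (id: string) => {
    if (done.has(id)) return
    if (visiting.has(id)) throw new Error(`Cycle detected at node ${id}`)
    visiting.add(id)
    const node = nodes.get(id)
    if (!node) throw new Error(`Unknown node ${id}`)
    node.deps.forEach(visit)
    visiting.delete(id)
    done.add(id)
    ordered.push(node)
  }

  visit(targetId)
  return ordered
}

async function runFlow(
  nodes: Map<string, FlowNode>,
  handlers: Record<string, NodeHandler>,
  targetId: string,
): Promise<unknown> {
  const session = new Map<string, unknown>() // shared state between connected nodes
  const ordered = topologicalOrder(nodes, targetId)

  // Phase 1: prepare every upstream dependency so its output lands in the session.
  for (const node of ordered.slice(0, -1)) {
    await handlers[node.type].prepare(node, session)
  }

  // Phase 2: execute the target node against the prepared session state.
  const target = ordered[ordered.length - 1]
  return handlers[target.type].execute(target, session)
}
```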
```mermaid
graph TB
subgraph "π¨ UI Layer (Main Thread)"
RF[ReactFlow Canvas]
CHAT[Chat Interface]
EDITOR[Document Editor]
VSCODE[Code Editor]
end
subgraph "π Flow Machine Core"
FM[FlowMachine]
FDS[FlowDataService]
HANDLERS[Node Handlers]
SESSION[Session State]
end
subgraph "ποΈ Data Layer"
DB[(Database Worker)]
VECTOR[(Vector Database)]
FILES[File System]
end
subgraph "π€ AI Processing"
LLM[LLM Thread]
EMBED[Embedding Thread]
TOOLS[Tool Handlers]
end
subgraph "π§ Node Types"
PROMPT[Prompt Nodes]
LLMNODE[LLM Nodes]
DATA[Data Nodes]
SCHEMA[Schema Nodes]
AGENT[Agent Nodes]
end
%% UI to Flow Machine connections
RF --> FM
CHAT --> FM
EDITOR --> FM
VSCODE --> FM
%% Flow Machine internal connections
FM --> FDS
FM --> HANDLERS
FM --> SESSION
%% Data layer connections
FDS --> DB
FDS --> VECTOR
FDS --> FILES
%% AI processing connections
HANDLERS --> LLM
HANDLERS --> EMBED
HANDLERS --> TOOLS
%% Node type connections
HANDLERS --> PROMPT
HANDLERS --> LLMNODE
HANDLERS --> DATA
HANDLERS --> SCHEMA
HANDLERS --> AGENT
%% Styling
classDef uiLayer fill:#e3f2fd
classDef flowCore fill:#f3e5f5
classDef dataLayer fill:#e8f5e8
classDef aiLayer fill:#fff3e0
classDef nodeTypes fill:#fce4ec
class RF,CHAT,EDITOR,VSCODE uiLayer
class FM,FDS,HANDLERS,SESSION flowCore
class DB,VECTOR,FILES dataLayer
class LLM,EMBED,TOOLS aiLayer
class PROMPT,LLMNODE,DATA,SCHEMA,AGENT nodeTypes
```
```mermaid
sequenceDiagram
participant UI as UI Interface
participant FM as FlowMachine
participant FDS as FlowDataService
participant Handler as NodeHandler
participant AI as AI Thread
participant DB as Database
Note over UI,DB: Workflow Execution Pipeline
UI->>FM: Execute Target Node
FM->>FDS: Get Connected Nodes
FDS->>DB: Query Node Dependencies
DB-->>FDS: Return Graph Data
FDS-->>FM: FlowGraph Structure
Note over FM,Handler: Phase 1: Preparation
loop For Each Dependency
FM->>Handler: prepare(node, context)
Handler->>AI: Process AI Task
AI-->>Handler: Return Result
Handler-->>FM: Preparation Complete
end
Note over FM,Handler: Phase 2: Execution
FM->>Handler: execute(targetNode, context)
Handler->>AI: Execute Main Logic
AI-->>Handler: Final Result
Handler-->>FM: Execution Complete
FM-->>UI: Workflow Results
```
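Continuing the hypothetical sketch from the key-features section, a concrete handler might implement the two phases like this; it is illustrative only and does not mirror the project's real handler code.

```ts
// Hypothetical prompt-node handler: prepare() renders a template into the shared
// session so downstream nodes can read it; execute() returns the rendered prompt.
const promptHandler: NodeHandler = {
  async prepare(node, session) {
    const template = 'Answer the question: {question}'
    const question = String(session.get('question') ?? '')
    session.set(`${node.id}:prompt`, template.replace('{question}', question))
  },

  async execute(node, session) {
    return session.get(`${node.id}:prompt`)
  },
}
```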
Component | Integration Purpose | Flow Machine Role |
---|---|---|
ReactFlow Canvas | Visual workflow creation | Executes user-designed node graphs |
Chat Interface | Conversational AI flows | Orchestrates message processing pipelines |
Document Editor | AI-assisted writing | Manages content generation workflows |
Code Editor | AI code assistance | Handles code analysis and generation flows |
Vector Database | Semantic search workflows | Coordinates embedding and retrieval operations |
LLM Thread | Language model processing | Manages prompt-to-response workflows |
For detailed technical documentation about the Flow Machine architecture, including:
- Implementation Details: Core classes and interfaces
- Node Handler Development: Creating custom node types
- Execution Context: Session state management
- Advanced Examples: Complex workflow patterns
Read the Complete Flow Machine Documentation
Technology | Purpose | Description |
---|---|---|
Vite | Build Tool | Fast and modern build tool for web projects |
React | UI Library | Popular JavaScript library for building user interfaces |
ReactFlow | Node Editor | Library for building node-based applications |
Technology | Purpose | Description |
---|---|---|
PGlite | Database | Lightweight WASM build of Postgres that runs in the browser
TypeORM | ORM | Object-relational mapping layer over the in-browser PGlite database
Voy | Vector Search | WASM vector similarity search engine in Rust |
Memory Vector Database | Vector Store | In-memory embeddings with linear search |
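As a rough idea of what browser-side SQL looks like with PGlite, here is a small example. The table and data are made up, and in NoLLM Chat persistence actually sits behind TypeORM inside the database worker rather than being called directly like this.

```ts
// Minimal PGlite sketch (illustrative table and data).
import { PGlite } from '@electric-sql/pglite'

const db = new PGlite() // in-memory by default; an 'idb://...' path persists to IndexedDB

await db.exec(`
  CREATE TABLE IF NOT EXISTS notes (
    id SERIAL PRIMARY KEY,
    body TEXT NOT NULL
  );
`)

await db.query('INSERT INTO notes (body) VALUES ($1)', ['hello from the browser'])
const result = await db.query('SELECT id, body FROM notes')
console.log(result.rows)
```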
Technology | Purpose | Description |
---|---|---|
WebLLM | LLM Runtime | Run large language models in browser without servers |
Langchain | AI Framework | Framework for developing LLM-powered applications |
Langgraph | Graph Workflows | Framework for building stateful, graph-based LLM agent workflows
Technology | Purpose | Description |
---|---|---|
shadcn UI | UI Components | Modern React component library |
Tailwind CSS | CSS Framework | Utility-first CSS framework |
magicui | Components | Additional UI component library |
kokonut | Components | Specialized component collection |
Technology | Purpose | Description |
---|---|---|
React Router | Routing | Declarative routing for React applications |
Zustand | State Management | Small, fast, and scalable state management |
i18next | Internationalization | Framework for browser internationalization |
ESLint | Code Linting | Pluggable linter for JavaScript patterns |
Prettier | Code Formatting | Opinionated code formatter for consistency |
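For a feel of how application state stays small and local, here is a hypothetical Zustand store along the lines of what might live under `src/states/`; the state shape is invented for illustration.

```ts
// Hypothetical Zustand store; the state shape is illustrative only.
import { create } from 'zustand'

interface SessionState {
  activeSessionId?: string
  messages: string[]
  setActiveSession: (id: string) => void
  addMessage: (message: string) => void
}

export const useSessionStore = create<SessionState>((set) => ({
  activeSessionId: undefined,
  messages: [],
  setActiveSession: (id) => set({ activeSessionId: id }),
  addMessage: (message) => set((state) => ({ messages: [...state.messages, message] })),
}))
```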
Get up and running with NoLLM Chat in just a few steps:
- Clone the Repository

  ```bash
  git clone git@github.com:zrg-team/NoLLMChat.git
  ```

- Install Dependencies

  ```bash
  cd NoLLMChat
  yarn install
  ```

- Start the Development Server

  ```bash
  yarn dev
  ```

- Open in Browser: Visit `http://localhost:PORT` to start exploring AI workflows!
NoLLM Chat provides native, browser-based language model inference without requiring external APIs (a short usage sketch follows this list):
- WebLLM: High-performance inference using WebGPU/WebAssembly with MLC models
- Wllama: Lightweight WASM-based inference with HuggingFace models
- OpenAI-Compatible API: Unified interface for both providers
- Structured Output: JSON schema support and function calling (WebLLM)
- Privacy-First: All processing happens locally in your browser
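As an illustration of the OpenAI-compatible interface, calling WebLLM directly looks roughly like this. The model id is just an example from WebLLM's prebuilt list, and NoLLM Chat runs this kind of call inside its dedicated LLM thread rather than on the main thread.

```ts
// Rough WebLLM usage sketch; the model id is an example, and in NoLLM Chat this
// work happens inside the LLM worker thread, not on the main thread.
import { CreateMLCEngine } from '@mlc-ai/web-llm'

const engine = await CreateMLCEngine('Llama-3.1-8B-Instruct-q4f32_1-MLC', {
  initProgressCallback: (report) => console.log(report.text), // model download progress
})

const reply = await engine.chat.completions.create({
  messages: [{ role: 'user', content: 'Explain vector databases in one sentence.' }],
})

console.log(reply.choices[0].message.content)
```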
Complete Local LLM Documentation
- Explore the Demo: Try the live demo first
- Create Your First Workflow: Use the visual node editor to build AI pipelines
- Connect Data Sources: Import your data using CSV, JSONL, or vector databases (a small vector-store sketch follows this list)
- Deploy Locally: Run everything in your browser without external dependencies
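For the data-source step, a semantic lookup over imported text can be as simple as the sketch below, which uses LangChain's in-memory vector store with a toy stand-in embedding function; the real app computes embeddings with an actual model in its embedding thread.

```ts
// In-memory vector store sketch. The toy embedding function is a stand-in for a
// real embedding model (which NoLLM Chat runs in its embedding worker thread).
import { MemoryVectorStore } from 'langchain/vectorstores/memory'

// Crude stand-in embedding: hash characters into a fixed-size vector.
function embed(text: string): number[] {
  const vector = new Array(8).fill(0)
  for (let i = 0; i < text.length; i++) vector[i % 8] += text.charCodeAt(i) / 1000
  return vector
}

const toyEmbeddings = {
  embedQuery: async (text: string) => embed(text),
  embedDocuments: async (texts: string[]) => texts.map(embed),
}

const store = await MemoryVectorStore.fromTexts(
  ['NoLLM Chat runs models in the browser', 'Vector databases enable semantic search'],
  [{ id: 1 }, { id: 2 }],
  toyEmbeddings,
)

const hits = await store.similaritySearch('how does semantic search work?', 1)
console.log(hits[0].pageContent)
```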
We welcome contributions from the community! Whether it's:
- Bug fixes
- New features
- Documentation improvements
- Ideas and suggestions
Your help is greatly appreciated! Please check our contribution guidelines for more information.
This project is licensed under the MIT License. See the LICENSE file for more details.
Got questions, feedback, or suggestions? We'd love to hear from you!
- Email: zerglingno2@outlook.com
- Issues: Open an issue on GitHub
- Discussions: Join our community discussions
Built with ❤️ for the AI community