Not-Only LLM Chat. An AI application that enhances creativity and user experience beyond just LLM chat. Note: this is a beta version; if you run into a database issue, please clear the site data from your browser's developer tools.


🚀 NoLLM Chat

A revolutionary AI interaction platform that enhances traditional LLM experiences with a versatile, visual interface for exploring AI technologies directly in your browser.


🎯 Overview

NoLLM Chat revolutionizes AI interaction by providing a platform that goes beyond basic chat interactions. It enables users to interact with language models in ways that boost creativity and enrich their experience through:

  • 🎨 Visual Interface: Node-based workflow creation and management
  • 🔧 Browser-Based: Runs locally and free of charge with optional cloud extensions
  • 🔄 Workflow Automation: Create custom AI workflows tailored to your needs
  • 📚 Comprehensive Learning: Interactive tools for LLMs, prompt engineering, and vector databases

✨ Live Demo

🌐 Try Live Demo 🌐

Development Progress: [■■■■░░░░░░] 30%


🎥 Features Showcase

πŸ–ΌοΈ Platform Overview

Intro Image

🎬 Interactive Demo

Demo

💬 Built-in Chat Application

Demo Chat Application

πŸ“ Built-in Document Editor

Demo Document Editor

πŸ‘¨β€πŸ’» Built-in Code Editor with Sandbox

VSLite Application


🎯 Vision

🚀 Enhanced AI Interaction

Move beyond traditional LLM chat with a platform offering a more flexible and visual interface. Users can directly edit and guide AI to improve response quality, enabling richer interaction experiences.

⚡ Automated Personal Workflows

Empowers users to create custom AI workflows tailored to their needs, enhancing productivity and personalization.

🧠 Comprehensive AI Learning

Utilize node-based tools that facilitate interaction with and learning about AI technologies. The platform supports LLMs, prompt engineering, function calls, and vector databases, allowing users to experiment and see the impact of different AI components.

🆓 Free and Browser-Based

Operates locally and free of charge, with the option to extend capabilities using services like OpenAI. This ensures accessibility and ease of use directly from the browser.


πŸ“ Project Structure

src/
│
├── assets/         # Static assets like images and fonts
├── components/     # Reusable React components
├── constants/      # Constant values and configuration settings
├── contexts/       # React context providers for global state management
├── css/            # Styling files (CSS or preprocessor files)
├── hooks/          # Custom React hooks
├── i18n/           # Internationalization setup and resources
├── lib/            # Utility libraries and third-party integrations
├── pages/          # Page components for different routes
├── services/       # API calls and service functions
├── states/         # State management files (e.g., Zustand)
├── utils/          # Utility functions and helpers
│
├── App.tsx         # Main application component
├── main.tsx        # Entry point of the application
└── routes.tsx      # Route configurations

πŸ—οΈ Project Architecture

The architecture is designed to efficiently handle different tasks by dividing them into separate threads. This ensures smooth operation and responsiveness while managing complex processes in the background.

🧵 Thread Architecture

| Thread | Responsibility | Technologies |
| --- | --- | --- |
| 🎨 Main Thread | UI application logic and responsive interface | React, ReactFlow, Zustand |
| 🗃️ Database Worker | Data storage and retrieval operations | TypeORM, PGLite |
| 🤖 LLM Thread | Large language model processes and AI computations | WebLLM, Langchain |
| 🔍 Embedding Thread | Vector database and embedding model processing | Memory Vector DB, Voy |

graph LR
    A[Main Thread] <--> C[Database Worker Thread]
    C -->|uses| I((TypeORM))
    I -->|Wraps| D((PGLite))
    A <--> E[LLM Thread]
    E -->|Uses| J((Langchain))
    J -->|Wraps| F((WebLLM))
    A <--> G[(Memory Vector database)]
    G --> K[Embedding thread]
    K -->|Use| L((Embedding Model))
    
    A -->|Handle| B((UI Application Logic))
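The split shown above relies on standard Web Worker message passing to keep heavy work off the main thread. The sketch below only illustrates that pattern under assumed names: the worker path, the DbRequest/DbResponse shapes, and the runQuery helper are hypothetical, not the project's actual protocol.

// Main thread: forward SQL to the database worker and await its reply (illustrative only).
type DbRequest = { id: number; sql: string; params?: unknown[] }
type DbResponse = { id: number; rows?: unknown[]; error?: string }

const dbWorker = new Worker(new URL('./database.worker.ts', import.meta.url), { type: 'module' })
let nextId = 0

function runQuery(sql: string, params: unknown[] = []): Promise<unknown[]> {
  const id = ++nextId
  return new Promise((resolve, reject) => {
    const onMessage = (event: MessageEvent<DbResponse>) => {
      if (event.data.id !== id) return
      dbWorker.removeEventListener('message', onMessage)
      if (event.data.error) reject(new Error(event.data.error))
      else resolve(event.data.rows ?? [])
    }
    dbWorker.addEventListener('message', onMessage)
    dbWorker.postMessage({ id, sql, params } satisfies DbRequest)
  })
}

Because the query resolves off the main thread, ReactFlow and the rest of the UI stay responsive while PGLite and TypeORM run inside the worker.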

🔄 Flow Machine Engine

The Flow Machine is the core orchestration engine that powers NoLLM Chat's workflow capabilities. It provides a sophisticated two-phase execution system for managing complex AI workflows with dynamic data sharing and dependency management.

🎯 Key Features

  • ⚡ Two-Phase Execution: Separate prepare and execute phases for optimal performance (see the sketch after this list)
  • 🔗 Dynamic Dependency Resolution: Automatic discovery of upstream node dependencies
  • 📊 Shared Session State: Seamless data sharing between connected nodes
  • 🏗️ Modular Handler System: Extensible architecture for custom node types
  • 🔄 Topological Sorting: Ensures proper execution order with cycle detection (a generic sketch follows the execution-flow diagram below)
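
A minimal sketch of what that two-phase contract could look like is shown below. The FlowNode, ExecutionContext, and NodeHandler shapes are assumptions for illustration, not the project's actual interfaces.

// Hypothetical two-phase handler contract (illustrative only).
interface FlowNode {
  id: string
  type: string                      // e.g. 'prompt', 'llm', 'schema'
  data: Record<string, unknown>
}

interface ExecutionContext {
  session: Map<string, unknown>     // shared state visible to connected nodes
  upstream: (nodeId: string) => FlowNode[]
}

interface NodeHandler {
  // Phase 1: resolve inputs and stash intermediate results in the shared session.
  prepare(node: FlowNode, ctx: ExecutionContext): Promise<void>
  // Phase 2: produce the node's final output once all dependencies are prepared.
  execute(node: FlowNode, ctx: ExecutionContext): Promise<unknown>
}

// A registry keyed by node type keeps the handler system modular and extensible.
const handlers = new Map<string, NodeHandler>()
const registerHandler = (type: string, handler: NodeHandler) => handlers.set(type, handler)

Registering one handler per node type is one way to keep custom nodes (prompt, LLM, schema, agent, and so on) pluggable without touching the engine core.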

πŸ›οΈ Flow Machine Integration Architecture

graph TB
    subgraph "🎨 UI Layer (Main Thread)"
        RF[ReactFlow Canvas]
        CHAT[Chat Interface]
        EDITOR[Document Editor]
        VSCODE[Code Editor]
    end
    
    subgraph "🔄 Flow Machine Core"
        FM[FlowMachine]
        FDS[FlowDataService]
        HANDLERS[Node Handlers]
        SESSION[Session State]
    end
    
    subgraph "🗃️ Data Layer"
        DB[(Database Worker)]
        VECTOR[(Vector Database)]
        FILES[File System]
    end
    
    subgraph "🤖 AI Processing"
        LLM[LLM Thread]
        EMBED[Embedding Thread]
        TOOLS[Tool Handlers]
    end
    
    subgraph "🔧 Node Types"
        PROMPT[Prompt Nodes]
        LLMNODE[LLM Nodes]
        DATA[Data Nodes]
        SCHEMA[Schema Nodes]
        AGENT[Agent Nodes]
    end
    
    %% UI to Flow Machine connections
    RF --> FM
    CHAT --> FM
    EDITOR --> FM
    VSCODE --> FM
    
    %% Flow Machine internal connections
    FM --> FDS
    FM --> HANDLERS
    FM --> SESSION
    
    %% Data layer connections
    FDS --> DB
    FDS --> VECTOR
    FDS --> FILES
    
    %% AI processing connections
    HANDLERS --> LLM
    HANDLERS --> EMBED
    HANDLERS --> TOOLS
    
    %% Node type connections
    HANDLERS --> PROMPT
    HANDLERS --> LLMNODE
    HANDLERS --> DATA
    HANDLERS --> SCHEMA
    HANDLERS --> AGENT
    
    %% Styling
    classDef uiLayer fill:#e3f2fd
    classDef flowCore fill:#f3e5f5
    classDef dataLayer fill:#e8f5e8
    classDef aiLayer fill:#fff3e0
    classDef nodeTypes fill:#fce4ec
    
    class RF,CHAT,EDITOR,VSCODE uiLayer
    class FM,FDS,HANDLERS,SESSION flowCore
    class DB,VECTOR,FILES dataLayer
    class LLM,EMBED,TOOLS aiLayer
    class PROMPT,LLMNODE,DATA,SCHEMA,AGENT nodeTypes

📖 Flow Machine Execution Flow

sequenceDiagram
    participant UI as 🎨 UI Interface
    participant FM as 🔄 FlowMachine
    participant FDS as 📊 FlowDataService
    participant Handler as 🔧 NodeHandler
    participant AI as 🤖 AI Thread
    participant DB as 🗃️ Database
    
    Note over UI,DB: Workflow Execution Pipeline
    
    UI->>FM: Execute Target Node
    FM->>FDS: Get Connected Nodes
    FDS->>DB: Query Node Dependencies
    DB-->>FDS: Return Graph Data
    FDS-->>FM: FlowGraph Structure
    
    Note over FM,Handler: Phase 1: Preparation
    loop For Each Dependency
        FM->>Handler: prepare(node, context)
        Handler->>AI: Process AI Task
        AI-->>Handler: Return Result
        Handler-->>FM: Preparation Complete
    end
    
    Note over FM,Handler: Phase 2: Execution
    FM->>Handler: execute(targetNode, context)
    Handler->>AI: Execute Main Logic
    AI-->>Handler: Final Result
    Handler-->>FM: Execution Complete
    FM-->>UI: Workflow Results
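The "Topological Sorting" feature mentioned earlier is commonly implemented with Kahn's algorithm. The function below is a generic sketch with cycle detection, not the project's actual code.

// Kahn's algorithm: order node ids so every edge source comes before its target.
// Throws if the graph contains a cycle (illustrative sketch).
function topologicalOrder(nodeIds: string[], edges: Array<[string, string]>): string[] {
  const indegree = new Map<string, number>()
  const adjacency = new Map<string, string[]>()
  for (const id of nodeIds) {
    indegree.set(id, 0)
    adjacency.set(id, [])
  }
  for (const [from, to] of edges) {
    adjacency.get(from)?.push(to)
    indegree.set(to, (indegree.get(to) ?? 0) + 1)
  }
  const queue = nodeIds.filter((id) => indegree.get(id) === 0)
  const order: string[] = []
  while (queue.length > 0) {
    const current = queue.shift() as string
    order.push(current)
    for (const next of adjacency.get(current) ?? []) {
      const remaining = (indegree.get(next) ?? 0) - 1
      indegree.set(next, remaining)
      if (remaining === 0) queue.push(next)
    }
  }
  if (order.length !== nodeIds.length) throw new Error('Cycle detected in flow graph')
  return order
}

For a flow graph, nodeIds would be the nodes connected to the target and edges the directed connections between them; the resulting order drives the prepare phase.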

🔗 Integration Points

| Component | Integration Purpose | Flow Machine Role |
| --- | --- | --- |
| ReactFlow Canvas | Visual workflow creation | Executes user-designed node graphs |
| Chat Interface | Conversational AI flows | Orchestrates message processing pipelines |
| Document Editor | AI-assisted writing | Manages content generation workflows |
| Code Editor | AI code assistance | Handles code analysis and generation flows |
| Vector Database | Semantic search workflows | Coordinates embedding and retrieval operations |
| LLM Thread | Language model processing | Manages prompt-to-response workflows |

📚 Complete Documentation

For detailed technical documentation about the Flow Machine architecture, including:

  • Implementation Details: Core classes and interfaces
  • Node Handler Development: Creating custom node types
  • Execution Context: Session state management
  • Advanced Examples: Complex workflow patterns

👉 Read the Complete Flow Machine Documentation


πŸ› οΈ Libraries and Tools

πŸ—οΈ Core Framework

| Technology | Purpose | Description |
| --- | --- | --- |
| Vite | Build Tool | Fast and modern build tool for web projects |
| React | UI Library | Popular JavaScript library for building user interfaces |
| ReactFlow | Node Editor | Library for building node-based applications |

πŸ—„οΈ Data & Storage

| Technology | Purpose | Description |
| --- | --- | --- |
| PGLite | Database | Lightweight PostgreSQL client for browsers |
| TypeORM | ORM | Object-relational mapping with SQLite WASM support |
| Voy | Vector Search | WASM vector similarity search engine in Rust |
| Memory Vector Database | Vector Store | In-memory embeddings with linear search |
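
The "Memory Vector Database" row above amounts to brute-force similarity search over embeddings held in memory. A dependency-free sketch of that idea using cosine similarity (illustrative, not the project's implementation):

// Linear (brute-force) cosine-similarity search over in-memory embeddings.
type Entry = { id: string; text: string; vector: number[] }

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0
  let normA = 0
  let normB = 0
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]
    normA += a[i] * a[i]
    normB += b[i] * b[i]
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB) || 1)
}

function searchSimilar(entries: Entry[], queryVector: number[], k = 4): Entry[] {
  return [...entries]
    .sort((x, y) => cosineSimilarity(y.vector, queryVector) - cosineSimilarity(x.vector, queryVector))
    .slice(0, k)
}

Voy fills the same role but performs the similarity search in WASM, which scales better than plain JavaScript as the number of embeddings grows.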

🤖 AI & LLM Integration

| Technology | Purpose | Description |
| --- | --- | --- |
| WebLLM | LLM Runtime | Run large language models in browser without servers |
| Langchain | AI Framework | Framework for developing LLM-powered applications |
| Langgraph | Graph Models | Graph-based language model framework |

🎨 UI & Styling

| Technology | Purpose | Description |
| --- | --- | --- |
| shadcn UI | UI Components | Modern React component library |
| Tailwind CSS | CSS Framework | Utility-first CSS framework |
| magicui | Components | Additional UI component library |
| kokonut | Components | Specialized component collection |

βš™οΈ Development Tools

| Technology | Purpose | Description |
| --- | --- | --- |
| React Router | Routing | Declarative routing for React applications |
| Zustand | State Management | Small, fast, and scalable state management |
| i18next | Internationalization | Framework for browser internationalization |
| ESLint | Code Linting | Pluggable linter for JavaScript patterns |
| Prettier | Code Formatting | Opinionated code formatter for consistency |

🚀 Getting Started

Get up and running with NoLLM Chat in just a few steps:

📦 Installation

  1. Clone the Repository

    git clone git@github.com:zrg-team/NoLLMChat.git
  2. Install Dependencies

    cd NoLLMChat
    yarn install
  3. Start Development Server

    yarn dev
  4. Open in Browser: Visit http://localhost:PORT to start exploring AI workflows!

🤖 Local LLM Support

NoLLM Chat provides native browser-based language model inference without requiring external APIs:

  • 🌐 WebLLM: High-performance inference using WebGPU/WebAssembly with MLC models
  • ⚡ Wllama: Lightweight WASM-based inference with HuggingFace models
  • 🔗 OpenAI-Compatible API: Unified interface for both providers
  • 🎯 Structured Output: JSON schema support and function calling (WebLLM)
  • 💻 Privacy-First: All processing happens locally in your browser

👉 Complete Local LLM Documentation
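
For a rough idea of the in-browser path, the snippet below follows WebLLM's documented OpenAI-style chat API; the model id is only an example and may not match what NoLLM Chat ships with.

// Minimal WebLLM usage sketch: everything runs in the browser (WebGPU required).
import { CreateMLCEngine } from '@mlc-ai/web-llm'

async function ask(question: string): Promise<string> {
  // Downloads and caches the model weights on first use.
  const engine = await CreateMLCEngine('Llama-3.2-1B-Instruct-q4f32_1-MLC')
  const reply = await engine.chat.completions.create({
    messages: [{ role: 'user', content: question }],
  })
  return reply.choices[0]?.message?.content ?? ''
}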

🎯 Quick Start Guide

  1. Explore the Demo: Try the live demo first
  2. Create Your First Workflow: Use the visual node editor to build AI pipelines
  3. Connect Data Sources: Import your data using CSV, JSONL, or vector databases
  4. Deploy Locally: Run everything in your browser without external dependencies

🤝 Contributing

We welcome contributions from the community! Whether it's:

  • πŸ› Bug fixes
  • ✨ New features
  • πŸ“– Documentation improvements
  • πŸ’‘ Ideas and suggestions

Your help is greatly appreciated! Please check our contribution guidelines for more information.


📄 License

This project is licensed under the MIT License. See the LICENSE file for more details.


📞 Contact

Got questions, feedback, or suggestions? We'd love to hear from you!


Built with ❤️ for the AI community
