neural-chat

AI chat interface for the browser. Connects to OpenAI GPT models via a local backend with a modern sidebar layout, real-time responses, and response timing metrics.


Features

  • Multi-model support — switch between GPT-4o mini, GPT-4o, and GPT-4.1
  • Dark theme UI — ambient animated background with a polished dark design
  • Sidebar layout — recent conversations, model status indicator, new chat button
  • Quick-start chips — one-click prompt suggestions on the welcome screen
  • Response timing — displays how long each response took in seconds
  • Responsive design — works on desktop, tablet, and mobile
  • Keyboard shortcuts — Enter to send, Shift+Enter for a new line

Tech Stack

  • Frontend — Vanilla HTML, CSS, JavaScript
  • Fonts — Syne (display) + DM Sans (body) via Google Fonts
  • Backend — Python (Flask, FastAPI, or similar) serving the /generate endpoint
  • AI — OpenAI API (GPT-4o mini, GPT-4o, GPT-4.1)

Project Structure

neural-chat/
├── index.html       # Main HTML structure
├── styles.css       # All styles and responsive layout
├── script.js        # Frontend logic, API calls, message rendering
├── app.py           # Backend server (your existing backend)
└── README.md

Getting Started

Prerequisites

  • Python 3.8+
  • OpenAI API key

Installation

  1. Clone the repository

    git clone https://github.com/yourusername/neural-chat.git
    cd neural-chat
  2. Install backend dependencies

    pip install flask openai
  3. Set your OpenAI API key

    export OPENAI_API_KEY=your_api_key_here
    or replace your_api_key_here in the .env file
  4. Run the server

    python app.py
  5. Open in browser

    http://localhost:5000
    

Backend API

The frontend expects a single endpoint:

POST /generate

Request body:

{
  "message": "What is AI?",
  "model": "gpt-4o-mini"
}

Response:

{
  "response": "AI stands for...",
  "duration": 1.42
}

Error response:

{
  "error": "Something went wrong"
}
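The contract above can be sketched in Flask. This is a minimal illustration, not the project's actual app.py — the ask_model helper is a hypothetical stand-in for the real OpenAI call (shown in the comment), so the error and timing behavior can be seen without an API key:

```python
import time
from flask import Flask, jsonify, request

app = Flask(__name__)

def ask_model(message, model):
    # Stand-in for the real OpenAI call, e.g.:
    #   from openai import OpenAI
    #   client = OpenAI()
    #   resp = client.chat.completions.create(
    #       model=model,
    #       messages=[{"role": "user", "content": message}])
    #   return resp.choices[0].message.content
    return f"[{model}] echo: {message}"

@app.route("/generate", methods=["POST"])
def generate():
    data = request.get_json(silent=True) or {}
    message = data.get("message", "").strip()
    model = data.get("model", "gpt-4o-mini")
    if not message:
        # Matches the documented error shape: {"error": "..."}
        return jsonify({"error": "message is required"}), 400
    start = time.perf_counter()
    try:
        reply = ask_model(message, model)
    except Exception as exc:
        return jsonify({"error": str(exc)}), 500
    # duration in seconds, as shown in the UI's timing metric
    return jsonify({"response": reply,
                    "duration": round(time.perf_counter() - start, 2)})
```

The duration field is measured server-side around the model call, which is what the frontend displays next to each response.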

Configuration

Static files (styles.css, script.js) are served from the /static/ path. Make sure your backend serves them accordingly, e.g. in Flask:

from flask import Flask, send_from_directory

app = Flask(__name__, static_folder='static')

@app.route('/')
def index():
    return send_from_directory('.', 'index.html')

Place styles.css and script.js inside a static/ folder, or adjust the paths in index.html to match your setup.


Customization

What                    Where
App name / branding     index.html — .logo-text and <title>
Accent color            styles.css — --accent CSS variable
Available models        index.html — <select id="modelSelect"> options
Welcome chips           index.html — .chip buttons in #welcomeScreen
Background orb colors   styles.css — .orb-1, .orb-2, .orb-3

License

MIT
