Full-Stack Chat App (React + NestJS + Python gRPC)

Architecture

React (WebSocket) ⇄ NestJS (WS Gateway + gRPC client) ⇄ Python (gRPC streaming mock LLM)
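A rough sketch of the middle hop, assuming NestJS's default Socket.IO adapter and hypothetical names (ChatService, chatStream, the CHAT_PACKAGE injection token, and the chunk shape); the actual gateway lives in /backend:

import { Inject, OnModuleInit } from '@nestjs/common';
import { ClientGrpc } from '@nestjs/microservices';
import {
  ConnectedSocket,
  MessageBody,
  SubscribeMessage,
  WebSocketGateway,
} from '@nestjs/websockets';
import { Observable } from 'rxjs';
import { Socket } from 'socket.io';

// Assumed message shapes -- see proto/chat.proto for the real ones.
interface ChatChunk { text: string; }
interface ChatService {
  chatStream(req: { prompt: string }): Observable<ChatChunk>;
}

@WebSocketGateway({ cors: true })
export class ChatGateway implements OnModuleInit {
  private chatService: ChatService;

  constructor(@Inject('CHAT_PACKAGE') private readonly client: ClientGrpc) {}

  onModuleInit() {
    this.chatService = this.client.getService<ChatService>('ChatService');
  }

  @SubscribeMessage('chat:send')
  handleSend(@MessageBody() prompt: string, @ConnectedSocket() socket: Socket) {
    // Open the server-streaming RPC and relay each chunk to the caller.
    this.chatService.chatStream({ prompt }).subscribe({
      next: (chunk) => socket.emit('chat:chunk', chunk),
      error: (err) => socket.emit('chat:error', String(err)),
    });
  }
}

On the Python side, the matching streaming handler yields chunks from the mock LLM one at a time, which is what makes the token-by-token relay possible.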

Structure

  • /frontend — React + Vite app
  • /backend — NestJS app with WS gateway and gRPC client
  • /llm_service — Python gRPC server simulating an LLM
  • /proto — Protocol Buffers definitions

Prerequisites

  • Node.js 18+
  • Python 3.10+

Install

From the repository root:

npm install
cd backend && npm install
cd ../frontend && npm install
cd ../llm_service && python -m pip install -r requirements.txt

Run

In three separate terminals, from the repository root:

npm run start:llm
npm run start:backend
npm run start:frontend

Then open http://localhost:5173 in your browser.

Notes

  • WebSocket endpoint: the frontend connects to ws://localhost:3000.
  • The backend proxies chat:send events to the Python gRPC ChatStream and relays the streamed chunks back to the client as chat:chunk events (see the client-side sketch below).
  • The gRPC service definition lives in proto/chat.proto.
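
A minimal sketch of the client side of this exchange, assuming the gateway uses the Socket.IO adapter and that each chunk carries a text field (both assumptions; the real UI lives in /frontend):

import { io } from 'socket.io-client';

// Assumed endpoint and event names from the notes above.
const socket = io('ws://localhost:3000');
let answer = '';

// Send a prompt to the backend gateway...
socket.emit('chat:send', 'Hello, model!');

// ...and accumulate the streamed chunks as they arrive.
socket.on('chat:chunk', (chunk: { text: string }) => {
  answer += chunk.text;
  console.log(answer);
});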
