🎥 Camera Streamer

Real-Time Camera Streaming over WebSockets

A lightweight experimental project that streams live camera data from the browser to a Node.js backend using the WebSocket protocol. The backend saves the streamed data as a video file.

This project was built purely for learning and analysis, to understand how real-time streaming can be implemented using WebSockets. While this is not the most efficient way to stream video, it provides great insight into how data packets travel between a browser and a server in real time.


🚀 Overview

The app allows you to:

  1. Open your camera directly from the browser.
  2. Start recording video.
  3. Stream the video data live to the server through a WebSocket connection.
  4. Have the server write the received video chunks to a file using Node’s fs.createWriteStream.

Once recording stops, the WebSocket connection closes and the video is finalized on the server.
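
In code, the client-side flow looks roughly like the sketch below. This is a simplified illustration rather than the exact code in this repo: the HTTP port, the /prepare response shape ({ streamId }), the page's <video> element, and the one-second chunk interval are assumptions.

// Minimal client-side sketch (browser). Assumes the backend serves HTTP and
// WebSocket on port 8080 and that /prepare responds with { streamId }.
let mediaRecorder;
let socket;

async function openCamera() {
  // Ask for camera access and show the live preview in a <video> element.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  document.querySelector("video").srcObject = stream;
  return stream;
}

async function startRecording(stream) {
  // 1. Ask the server to prepare a write stream and return its ID.
  const res = await fetch("http://localhost:8080/prepare");
  const { streamId } = await res.json();

  // 2. Open the WebSocket, passing the stream ID as the key.
  socket = new WebSocket(`ws://localhost:8080?key=${streamId}`);

  socket.addEventListener("open", () => {
    // 3. Record the camera stream and send each chunk as binary data.
    mediaRecorder = new MediaRecorder(stream, { mimeType: "video/webm" });
    mediaRecorder.ondataavailable = (event) => {
      if (event.data.size > 0 && socket.readyState === WebSocket.OPEN) {
        socket.send(event.data);
      }
    };
    mediaRecorder.start(1000); // emit a chunk roughly every second
  });
}

function stopRecording() {
  // 4. Stopping the recorder and closing the socket finalizes the file on the server.
  mediaRecorder?.stop();
  socket?.close();
}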


🧠 How It Works (Logic Breakdown)

  1. Prepare Stream: When the user clicks Record, the client sends a GET request to the /prepare endpoint. The server (see the server-side sketch after this list):

    • Generates a unique stream ID.
    • Creates a write stream using fs.createWriteStream().
    • Stores this stream in a shared in-memory map (like a HashMap today; Redis in the future).
    • Responds to the client with the generated stream ID.
  2. Connect WebSocket: The client receives the stream ID and connects to:

    ws://localhost:8080?key=<streamId>

  3. Validate Connection: The server validates the key (stream ID).

    • If valid → starts listening for binary data (video chunks).
    • If invalid → closes the socket immediately.
  4. Stream Data: The browser uses the MediaRecorder API to record camera frames. Each recorded video chunk (a Blob) is sent through the WebSocket to the server, which writes it to the file stream corresponding to that key.

  5. End Stream: When recording stops:

    • The client closes the WebSocket connection.
    • The server detects the closure and ends the file write stream.
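
Putting the server side together, a minimal sketch of the /prepare endpoint and the WebSocket handler might look like this. It illustrates the logic above rather than the repository's exact code: the port, the ./videos output folder, and the { streamId } response shape are assumptions.

// Minimal server-side sketch: Express + ws sharing one HTTP server on port 8080.
import express from "express";
import http from "http";
import fs from "fs";
import crypto from "crypto";
import { WebSocketServer } from "ws";

const app = express();
// A real setup also needs CORS enabled for the Vite dev origin (http://localhost:5173).
const server = http.createServer(app);
const wss = new WebSocketServer({ server });

// Shared in-memory map of streamId -> file write stream (Redis could replace this later).
const streams = new Map();
fs.mkdirSync("./videos", { recursive: true });

// 1. /prepare: create a write stream, remember it, and return its ID.
app.get("/prepare", (req, res) => {
  const streamId = crypto.randomUUID();
  streams.set(streamId, fs.createWriteStream(`./videos/${streamId}.webm`));
  res.json({ streamId });
});

// 2-5. WebSocket: validate the key, append incoming chunks, finalize on close.
wss.on("connection", (socket, req) => {
  const key = new URL(req.url, "http://localhost:8080").searchParams.get("key");
  const fileStream = streams.get(key);

  if (!fileStream) {
    socket.close(); // unknown or missing key: reject immediately
    return;
  }

  socket.on("message", (chunk) => {
    fileStream.write(chunk); // append each binary chunk to the .webm file
  });

  socket.on("close", () => {
    fileStream.end();    // finalize the file
    streams.delete(key); // free the session
  });
});

server.listen(8080);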

✨ Features

  • ✅ Concurrent streaming — each client has an isolated session and file
  • 🔄 Real-time streaming of video data via WebSocket
  • 🎥 Record and stream camera footage directly from your browser
  • 💾 Saves video streams as .webm files on the server
  • 🧩 Simple client–server architecture for easy learning
  • 🧠 Demonstrates how binary data travels over WebSockets
  • ⚙️ Built entirely with native browser APIs and Node.js — no third-party media libraries

🧰 Tech Stack

Frontend

  • React + Vite
  • MediaRecorder API (for recording camera data)
  • Native WebSocket API (for live streaming)

Backend

  • Node.js + Express
  • ws (WebSocket library)
  • fs (for writing streams to files)

⚠️ A Note on Efficiency

While this works, using WebSockets for video streaming is not efficient for production. WebSockets weren’t designed for high-performance real-time media transfer — especially video/audio frames.

🧩 Better Alternatives

If you plan to build production-grade real-time streaming, look into:

  • 🎞 WebRTC — Peer-to-peer, designed for real-time video/audio streaming
  • WebTransport — Successor to WebSockets, supports bidirectional low-latency streams over QUIC
  • 🧮 WebCodecs API — Gives direct access to encoded media frames for more efficient handling

These technologies are optimized for latency, performance, and synchronization, unlike WebSockets.


🛠 Getting Started

1. Clone the repository

git clone https://github.com/rpaudel379/realtime-camera-streaming-websocket.git
cd realtime-camera-streaming-websocket

2. Install dependencies

In both the client and server folders, run:

npm install
# or
bun install

3. Start the backend server

cd server
npm run dev
# or
bun run dev

4. Start the frontend app

In a new terminal, from the project root:

cd client
npm run dev
# or
bun run dev

5. Open the app

Visit:

http://localhost:5173

Then:

  1. Click Open Camera.
  2. Click Record — this triggers /prepare, opens a WebSocket, and starts streaming.
  3. Stop recording to finalize the video on the backend.

🚧 Future Improvements

  • Store video data in AWS S3 or cloud storage.
  • Replace in-memory session map with Redis or database for distributed scalability.
  • Use WebRTC or WebTransport for better streaming performance.
  • Implement authentication and access control for streams.
  • Add a simple playback UI to preview recorded videos.
  • Enable custom file naming and progress indicators.

🤝 Contributing

Contributions are welcome! If you’d like to improve this project (optimize streaming, refactor architecture, or add new features):

  1. Fork the repo

  2. Create a new branch

    git checkout -b feature/amazing-improvement
  3. Commit your changes

  4. Push to your fork

  5. Open a Pull Request 🎉


📚 Purpose

This project is not meant to be a production-grade streaming solution. It’s a learning experiment to:

  • Explore the WebSocket protocol for binary data transfer.
  • Understand media handling in the browser.
  • Bridge frontend–backend communication for real-time applications.

🧾 License

Feel free to use it, modify it, and learn from it.


Built for learning. Inspired by curiosity.
