A lightweight framework for streaming video frames from a Python-based server to an iOS/macOS app over UDP. The server captures webcam footage, converts each frame to a PNG image, and sends it to the client, which uses Combine to publish the received `UIImage` objects.
- Server (Python): Captures video frames using OpenCV, crops and converts them to PNG with Pillow, and sends them over UDP to `127.0.0.1:5005`.
- Client (Swift): An iOS/macOS app that listens on UDP port 5005 using `Network.framework`, reconstructs PNG images into `UIImage` objects, and publishes them via a Combine `AnyPublisher`.
- Real-time video streaming over UDP.
- Configurable screen sizes for cropping frames (via `displays.csv`).
- Combine integration for reactive image handling in Swift.
- Command-line options for verbose mode and device selection.
- Python 3.6+
- Libraries: `numpy`, `opencv-python`, `pillow`, `pandas`
- Install dependencies: `pip install numpy opencv-python pillow pandas`
- A webcam (or modify `server.py` to use a video file).
- Xcode 12+ (for `Network.framework` and Combine support)
- iOS 13+ or macOS 10.15+
- Swift 5+
- `CamBridge/`: iOS/macOS app directory with Swift source files and assets.
- `resources/`: Contains `displays.csv` for screen size configurations.
- `server.py`: Python script for streaming video frames.
- `README.md`: This file.
- Clone the Repository:

  ```bash
  git clone https://github.com/tinyprocessing/CamBridge.git
  cd CamBridge
  ```

- Prepare the Server:
  - Ensure `displays.csv` is in the `resources/` directory with the columns `device,width,height` (e.g., `11,414,896` for iPhone 11); a sample loader is sketched after this list.
- Prepare the Client:
  - Open `CamBridge.xcodeproj` in Xcode.
  - Build the project for your target (iOS Simulator or device).
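For reference, `displays.csv` is expected to contain a header row `device,width,height` followed by one row per device (e.g., `11,414,896`). The sketch below shows one way the server could look up a screen size with pandas; the helper name `load_screen_size` is illustrative, and `server.py`'s actual parsing may differ.

```python
import pandas as pd

# Illustrative helper (not necessarily how server.py does it): look up the
# width/height pair for a device name in resources/displays.csv.
def load_screen_size(device_name, csv_path="resources/displays.csv"):
    displays = pd.read_csv(csv_path)  # columns: device,width,height
    row = displays[displays["device"].astype(str) == str(device_name)]
    if row.empty:
        raise ValueError(f"Unknown device: {device_name}")
    return int(row.iloc[0]["width"]), int(row.iloc[0]["height"])

# Example: an entry "11,414,896" yields (414, 896) for load_screen_size("11")
```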
- Navigate to the project root: `cd CamBridge`
- Run the server: `python server.py`
- Optional arguments (one way these flags might be parsed is sketched after this list):
  - `-v` or `--verbose`: Enable verbose logging.
  - `-h` or `--help`: Show help and exit.
  - `<device>`: Specify a device name (e.g., `11` for the iPhone 11 screen size).

  Example: `python server.py --verbose 11`
- Press `q` to stop the server.
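Purely as an illustration of the interface described above, the options could be handled with `argparse` along these lines; this is an assumption about the structure, not the exact code in `server.py`.

```python
import argparse

# Hypothetical sketch of the CLI described above; -h/--help comes from argparse itself.
parser = argparse.ArgumentParser(description="Stream webcam frames over UDP.")
parser.add_argument("device", nargs="?", default=None,
                    help="device name from displays.csv, e.g. 11 for iPhone 11")
parser.add_argument("-v", "--verbose", action="store_true",
                    help="enable verbose logging")
args = parser.parse_args()

if args.verbose:
    print(f"Verbose mode on; device profile: {args.device}")
```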
- In `ViewController.swift` (or another appropriate file), set up the communicator:

  ```swift
  import UIKit
  import Combine

  class ViewController: UIViewController {
      @IBOutlet weak var imageView: UIImageView!

      private var communicator: InterProcessCommunicator?
      private var cancellables = Set<AnyCancellable>()

      override func viewDidLoad() {
          super.viewDidLoad()
          communicator = InterProcessCommunicator()
          communicator?.connect()
          communicator?.imagePublisher
              .receive(on: DispatchQueue.main)
              .sink { [weak self] image in
                  self?.imageView.image = image
              }
              .store(in: &cancellables)
      }

      override func viewWillDisappear(_ animated: Bool) {
          super.viewWillDisappear(animated)
          communicator?.detachConnection()
      }
  }
  ```
- Connect a `UIImageView` in your storyboard to the `@IBOutlet`.
- Build and run the app in Xcode.
- Server (a rough sketch of this loop follows this list):
  - Captures frames from the webcam.
  - Crops frames to the specified screen size (from `displays.csv`).
  - Converts frames to PNG and sends them over UDP with a `.` delimiter between images.
- Client:
  - Listens on UDP port 5005.
  - Buffers incoming packets until a `.` delimiter is received.
  - Publishes reconstructed `UIImage` objects via a Combine `PassthroughSubject`.
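To make the server side of this concrete, here is a minimal sketch of the capture-and-send loop. It assumes OpenCV capture, Pillow PNG encoding, chunked UDP sends, and a `.` datagram as the frame delimiter, as described above; the crop logic, chunk size, and preview window are illustrative and may differ from the actual `server.py`.

```python
import io
import socket

import cv2
from PIL import Image

UDP_ADDR = ("127.0.0.1", 5005)
CHUNK = 1024                     # bytes per UDP packet (illustrative)
SCREEN_W, SCREEN_H = 414, 896    # e.g. the iPhone 11 row in displays.csv

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # one socket, reused for every frame
cap = cv2.VideoCapture(0)                                # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Crop to the configured screen size (illustrative top-left crop).
    frame = frame[:SCREEN_H, :SCREEN_W]

    # Encode as PNG with Pillow (OpenCV frames are BGR; Pillow expects RGB).
    image = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    buffer = io.BytesIO()
    image.save(buffer, format="PNG")
    data = buffer.getvalue()

    # Send the PNG in chunks, then a "." datagram so the client knows the frame ended.
    for i in range(0, len(data), CHUNK):
        sock.sendto(data[i:i + CHUNK], UDP_ADDR)
    sock.sendto(b".", UDP_ADDR)

    # Preview window so waitKey/q handling works; waitKey(33) also paces the loop to ~30 FPS.
    cv2.imshow("CamBridge", frame)
    if cv2.waitKey(33) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
sock.close()
```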
- Symptoms: `nw_listener_inbox_accept_udp socket() failed [24: Too many open files]`.
- Fixes:
  - Ensure `server.py` reuses a single socket (fixed in the provided code; see the sketch below).
  - Call `detachConnection()` when the Swift app stops (e.g., in `viewWillDisappear`).
  - Check `ulimit -n` (increase it if needed: `ulimit -n 4096`).
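For clarity on the single-socket fix, the difference is only where the socket is created; a hedged sketch, not the literal code in `server.py`:

```python
import socket

# Problematic pattern: opening a new socket for every frame eventually exhausts
# file descriptors ("Too many open files").
# while streaming:
#     sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#     sock.sendto(frame_bytes, ("127.0.0.1", 5005))

# Fixed pattern: create one socket up front and reuse it for every frame.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_frame(frame_bytes, addr=("127.0.0.1", 5005)):
    sock.sendto(frame_bytes, addr)
```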
- Verify both server and client are running simultaneously.
- Ensure UDP port 5005 isn’t blocked by a firewall.
- Add logging in `InterProcessCommunicator.receive(on:)` to debug packet reception.
- Add a delay in `server.py` (e.g., `cv2.waitKey(33)` for ~30 FPS); a pacing sketch follows this list.
- Adjust `screen_size` in `displays.csv` to reduce data size.
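Frame pacing can be done either via OpenCV's `cv2.waitKey` (when the server shows a preview window) or with a plain sleep when it does not; a small sketch under that assumption, with an illustrative helper name:

```python
import time

TARGET_FPS = 30
FRAME_INTERVAL = 1.0 / TARGET_FPS

# Illustrative pacing helper: sleep away whatever remains of the frame budget.
# (When server.py shows a preview window, cv2.waitKey(33) serves the same purpose
# and also handles the q-to-quit key.)
def pace(frame_start):
    elapsed = time.monotonic() - frame_start
    if elapsed < FRAME_INTERVAL:
        time.sleep(FRAME_INTERVAL - elapsed)
```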
Contributions are welcome! Please:
- Fork the repository.
- Create a feature branch (`git checkout -b feature/your-feature`).
- Commit your changes (`git commit -m "Add your feature"`).
- Push to the branch (`git push origin feature/your-feature`).
- Open a pull request.
Ideas for improvement:
- Add TCP support as an alternative protocol.
- Implement error handling for network interruptions.
- Support dynamic frame rate control.
This project is unlicensed—use it freely at your own risk!