A GStreamer-based application that bridges WebRTC streams from WHEP (WebRTC HTTP Egress Protocol) endpoints to SRT (Secure Reliable Transport) output streams.
This tool consumes WebRTC media from a WHEP endpoint and re-streams it as SRT, enabling integration between WebRTC and SRT-based workflows. This is particularly useful for:
- Converting WebRTC streams to professional broadcast formats
- Integrating WebRTC sources into SRT-based production pipelines
- Low-latency streaming to SRT consumers
- Building bridges between web-based and broadcast infrastructure
- WHEP Input: Consumes WebRTC streams via the WHEP protocol
- SRT Output: Outputs to SRT with configurable parameters
- Audio Processing: Automatically handles audio decoding, conversion, and AAC encoding
- Multi-track Support: Handles multiple audio tracks via audio mixing (liveadder)
- Continuous Output: Silent audio source ensures continuous stream even without input
- Docker Support: Ready-to-use Docker image with all dependencies included
- Flexible Configuration: Supports both `whepsrc` and `whepclientsrc` implementations
- Rust (1.83+ recommended, using 2024 edition)
- GStreamer 1.24+ with the following plugins:
- gstreamer-plugins-base
- gstreamer-plugins-good
- gstreamer-plugins-bad
- gstreamer-plugins-ugly
- gstreamer-libav
- gstreamer-nice
- Development libraries:
- libssl-dev
- libgstreamer1.0-dev
- libgstreamer-plugins-base1.0-dev
- libgstreamer-plugins-bad1.0-dev
This project requires GStreamer Rust plugins from gst-plugins-rs:
- `gst-plugin-webrtc` (provides `whepclientsrc` with the WHEP feature)

Note: The `Cargo.toml` currently uses a git dependency pinned to a specific commit SHA (`e136005b108ec85bdc8bc533c551f56ef978e950`) because the WHEP signaller feature is not yet available in the official crate release. This will be updated to use the published crate once the feature is available in the next official release.
```
# Clone the repository
git clone <repository-url>
cd whep-srt

# Build the project
cargo build --release

# The binary will be at target/release/whep-srt
```
```
# Build the Docker image
docker build -t whep-srt .

# Run the container
docker run -it whep-srt -i <WHEP_ENDPOINT_URL> -o <SRT_OUTPUT_URL>
```
```
./whep-srt -i <WHEP_INPUT_URL> -o <SRT_OUTPUT_URL>
```
| Option | Description | Default |
|---|---|---|
| `-i, --input-url` | WHEP source URL (required) | - |
| `-o, --output-url` | SRT output stream URL | `srt://0.0.0.0:1234?mode=listener` |
| `--dot-debug` | Output debug .dot files of the pipeline | `false` |
Listen for SRT connections on port 1234 (default):

```
./whep-srt -i http://localhost:8889/mystream/whep
```

Push to a specific SRT destination:

```
./whep-srt -i http://localhost:8889/mystream/whep \
  -o "srt://192.168.1.100:5000?mode=caller"
```

Using Docker with port mapping:

```
docker run -p 1234:1234/udp whep-srt \
  -i http://host.docker.internal:8889/mystream/whep \
  -o "srt://0.0.0.0:1234?mode=listener"
```

Running the included debug script:

```
# Edit run.sh to configure your WHEP endpoint
./run.sh
```
The application dynamically constructs a GStreamer pipeline that:
- WHEP Source: Connects to the WHEP endpoint using `whepsrc` or `whepclientsrc` (configurable)
- Dynamic Pad Handling: Detects and handles audio/video tracks as they become available (a sketch of this routing follows the pipeline strings below)
- Audio Processing Chain (a sketch of this chain follows this list):
  - Decodes incoming audio tracks using `decodebin`
  - Converts audio to F32LE format at 48kHz
  - Mixes multiple audio tracks using `liveadder`
  - Adds a silent audio test source to ensure continuous output
  - Encodes to AAC using `avenc_aac`
- Output Chain:
  - Muxes audio into MPEG-TS using `mpegtsmux`
  - Sends to SRT destination via `srtsink`
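Below is a minimal gstreamer-rs sketch of the static audio and output chain described above. The element names come from this README, but the function name, property values, and error handling are illustrative assumptions and may differ from the actual code in `src/main.rs` (a recent gstreamer-rs API is assumed):

```rust
use gstreamer as gst;
use gst::prelude::*;

// Builds silence -> capsfilter -> liveadder -> avenc_aac -> aacparse ->
// mpegtsmux -> queue -> srtsink and returns the mixer so decoded audio
// branches can be linked into it later. Illustrative sketch only.
fn build_audio_chain(
    pipeline: &gst::Pipeline,
    srt_uri: &str,
) -> Result<gst::Element, gst::glib::BoolError> {
    // F32LE at 48 kHz, as described above
    let caps = gst::Caps::builder("audio/x-raw")
        .field("format", "F32LE")
        .field("rate", 48_000i32)
        .build();

    let silence = gst::ElementFactory::make("audiotestsrc")
        .property_from_str("wave", "silence")
        .property("is-live", true)
        .build()?;
    let capsfilter = gst::ElementFactory::make("capsfilter")
        .property("caps", &caps)
        .build()?;
    let mixer = gst::ElementFactory::make("liveadder").build()?;
    let aac = gst::ElementFactory::make("avenc_aac").build()?;
    let parse = gst::ElementFactory::make("aacparse").build()?;
    let mux = gst::ElementFactory::make("mpegtsmux").build()?;
    let queue = gst::ElementFactory::make("queue").build()?;
    let sink = gst::ElementFactory::make("srtsink")
        .property("uri", srt_uri)
        .build()?;

    pipeline.add_many([&silence, &capsfilter, &mixer, &aac, &parse, &mux, &queue, &sink])?;
    // The silent source keeps the mixer producing output even when no WHEP
    // audio track has been linked yet.
    gst::Element::link_many([&silence, &capsfilter, &mixer, &aac, &parse, &mux, &queue, &sink])?;
    Ok(mixer)
}
```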
Pipeline String (when using `whepsrc`):

```
whepsrc → [dynamic audio pads] → decodebin → audioconvert → audioresample →
capsfilter → liveadder ← audiotestsrc (silence) → avenc_aac → aacparse →
mpegtsmux → queue → srtsink
```

Pipeline String (when using `whepclientsrc`):

```
whepclientsrc → [dynamic audio pads] → decodebin → audioconvert → audioresample →
capsfilter → liveadder ← audiotestsrc (silence) → avenc_aac → aacparse →
mpegtsmux → queue → srtsink
```
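The dynamic pad handling can be pictured roughly as in the sketch below, where each decoded pad is routed to the mixer (audio) or discarded (video). This is an illustration only, assuming a recent gstreamer-rs; the variable names and error handling are not the actual code in `src/main.rs`:

```rust
use gstreamer as gst;
use gst::prelude::*;

// Routes each new decodebin pad: raw audio goes through audioconvert and
// audioresample into the liveadder mixer, anything else (video) is sent to
// a fakesink. Illustrative sketch only.
fn route_decoded_pads(decodebin: &gst::Element, pipeline: &gst::Pipeline, mixer: &gst::Element) {
    let pipeline = pipeline.clone();
    let mixer = mixer.clone();
    decodebin.connect_pad_added(move |_, src_pad| {
        let is_audio = src_pad
            .current_caps()
            .and_then(|caps| caps.structure(0).map(|s| s.name().starts_with("audio/")))
            .unwrap_or(false);

        if is_audio {
            let convert = gst::ElementFactory::make("audioconvert").build().unwrap();
            let resample = gst::ElementFactory::make("audioresample").build().unwrap();
            pipeline.add_many([&convert, &resample]).unwrap();
            gst::Element::link_many([&convert, &resample, &mixer]).unwrap();
            convert.sync_state_with_parent().unwrap();
            resample.sync_state_with_parent().unwrap();
            src_pad
                .link(&convert.static_pad("sink").unwrap())
                .expect("failed to link decoded audio pad");
        } else {
            // Video tracks are currently discarded and not sent to SRT
            let fakesink = gst::ElementFactory::make("fakesink").build().unwrap();
            pipeline.add(&fakesink).unwrap();
            fakesink.sync_state_with_parent().unwrap();
            src_pad
                .link(&fakesink.static_pad("sink").unwrap())
                .expect("failed to link pad to fakesink");
        }
    });
}
```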
The SRT output URL supports standard SRT URI parameters:
- `mode=listener` - Wait for incoming connections (default)
- `mode=caller` - Connect to a remote SRT receiver
- `latency=<ms>` - Set SRT latency buffer (default: 100ms)
- Additional parameters supported by GStreamer's `srtsink` element
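For example, `srt://192.168.1.100:5000?mode=caller&latency=200` pushes the stream to a remote receiver with a 200 ms latency buffer.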
The application supports two WHEP source implementations (configurable in `src/main.rs:56`):
- `whepclientsrc` (currently enabled) - From `gst-plugin-webrtc`. Newer implementation using the signaller interface (will eventually replace `whepsrc`)
- `whepsrc` - From `gst-plugin-webrtchttp`. Original WebRTC implementation based on `webrtcbin`
Toggle between them by changing the `whepsrc` boolean variable in the code. Note: `whepclientsrc` requires the plugin to be registered via `gstrswebrtc::plugin_register_static()`, as shown in `src/main.rs:65`.
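A minimal sketch of that registration, assuming the `gstrswebrtc` crate from gst-plugins-rs is available as a dependency (the surrounding function and error handling are illustrative):

```rust
use gstreamer as gst;

fn init_gstreamer() -> Result<(), Box<dyn std::error::Error>> {
    gst::init()?;
    // Register the Rust WebRTC plugin statically so whepclientsrc can be
    // found by name.
    gstrswebrtc::plugin_register_static()?;
    // Sanity check: this fails if the registration did not take effect.
    let _src = gst::ElementFactory::make("whepclientsrc").build()?;
    Ok(())
}
```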
Audio Input (via RTP):
- OPUS (default, 48kHz)
Video Input (via RTP):
- VP8, VP9 (default)
- H.264, H.265
- AV1
Note: Video tracks are currently sent to `fakesink` and not included in SRT output.
Enable GStreamer debug output using environment variables:
```
# Show all debug output
GST_DEBUG=*:DEBUG ./whep-srt -i <WHEP_URL>

# Show WHEP-specific debug output
GST_DEBUG=*whep*:DEBUG ./whep-srt -i <WHEP_URL>

# Save debug log to file
GST_DEBUG_FILE=debug.log GST_DEBUG=*:DEBUG ./whep-srt -i <WHEP_URL>

# Generate pipeline visualization (DOT files) using the --dot-debug flag
./whep-srt -i <WHEP_URL> --dot-debug

# Or set the environment variable directly
GST_DEBUG_DUMP_DOT_DIR=./ ./whep-srt -i <WHEP_URL>
```
The application automatically generates GraphViz DOT files of the pipeline on state changes and errors when the `--dot-debug` flag is used. The files are timestamped with the format `<epoch>-<state>.dot` (e.g., `1729000000-Playing.dot`, `1729000000-error.dot`). Convert them to SVG for visualization:
```
# Convert a DOT file to SVG
dot -Tsvg 1729000000-error.dot -o pipeline.svg

# Or use xdot for interactive viewing
xdot 1729000000-error.dot
```
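For reference, a dump helper in the spirit of the `debug_pipeline` function could look like the sketch below. The file naming mirrors the description above, but the actual implementation in `src/main.rs` may differ, and the dump still relies on `GST_DEBUG_DUMP_DOT_DIR` being set:

```rust
use std::time::{SystemTime, UNIX_EPOCH};

use gstreamer as gst;

// Writes <epoch>-<label>.dot into GST_DEBUG_DUMP_DOT_DIR (GStreamer appends
// the .dot extension to the given name). Illustrative sketch only.
fn dump_pipeline(pipeline: &gst::Pipeline, label: &str) {
    let epoch = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .map(|d| d.as_secs())
        .unwrap_or(0);
    gst::debug_bin_to_dot_file(
        pipeline,
        gst::DebugGraphDetails::all(),
        format!("{epoch}-{label}"),
    );
}
```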
- `src/main.rs` - Main application logic
  - Command-line argument parsing (`Args` struct; a sketch follows this list)
  - Pipeline construction and management
  - Dynamic pad handling for audio/video tracks
  - Event loop and error handling
  - Debug pipeline visualization (`debug_pipeline` function)
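As an illustration of the command-line interface, the `Args` struct might look roughly like this, assuming a clap-style derive API (the actual argument-parsing crate and attributes used in `src/main.rs` are assumptions); option names and defaults match the table above:

```rust
use clap::Parser;

// Hypothetical sketch of the Args struct; option names and defaults are
// taken from the options table in this README.
#[derive(Parser, Debug)]
struct Args {
    /// WHEP source URL (required)
    #[arg(short = 'i', long)]
    input_url: String,

    /// SRT output stream URL
    #[arg(short = 'o', long, default_value = "srt://0.0.0.0:1234?mode=listener")]
    output_url: String,

    /// Output debug .dot files of the pipeline
    #[arg(long)]
    dot_debug: bool,
}
```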
- Video handling: Video tracks are currently discarded (sent to `fakesink`)
- Git dependency: Uses a pinned git commit for `gst-plugin-webrtc` until the WHEP feature is available in a published crate
- Audio-only output: Only audio is currently muxed into the SRT output
Missing GStreamer elements: If you get errors about missing elements, ensure all required GStreamer plugins are installed:
```
gst-inspect-1.0 whepclientsrc
gst-inspect-1.0 srtsink
gst-inspect-1.0 avenc_aac
```
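The same check can also be done from Rust before building the pipeline; this is just an optional sketch (element names taken from this README):

```rust
use gstreamer as gst;

// Prints a warning for every required element that cannot be found.
fn check_elements() -> Result<(), gst::glib::Error> {
    gst::init()?;
    for name in ["whepclientsrc", "srtsink", "avenc_aac", "mpegtsmux", "liveadder"] {
        if gst::ElementFactory::find(name).is_none() {
            eprintln!("missing GStreamer element: {name}");
        }
    }
    Ok(())
}
```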
SRT connection issues: Check your firewall settings and ensure the SRT port (default 1234/udp) is accessible.
No audio output: Enable debug logging to see if audio pads are being created and linked correctly.
- Add video support to SRT output
- Support for published crates.io versions of gst-plugins-rs
See the LICENSE file for details.
Per Enstedt <per.enstedt@eyevinn.se>
Developed at Eyevinn Technology
Contributions are welcome! Please feel free to submit issues or pull requests.
- GStreamer - Multimedia framework
- gst-plugins-rs - GStreamer plugins written in Rust
- WHEP Specification - WebRTC HTTP Egress Protocol
- SRT Alliance - Secure Reliable Transport protocol