Alternatively, you can build SRS from its source code.
cd ~/git
git clone -b develop https://github.com/ossrs/srs.git
cd srs/trunk
./configure
make
# After building SRS, you may run it using a configuration file.
cd ~/git/srs/trunk
./objs/srs -c conf/rtc2rtmp.conf
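Once SRS is running, you can optionally check that its HTTP API is up. This is a sketch assuming the default API port 1985 used by the stock configuration; the exact response fields depend on your SRS version.

```shell
# Query the SRS HTTP API to confirm the server started.
# Port 1985 is the default API port; adjust if you changed the config.
curl http://localhost:1985/api/v1/versions
```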
Note: Please upgrade to SRS version 5.0.153 or higher, or 6.0.43 or higher.
To download the code and build FFmpeg, you can use the following commands.
cd ~/git
git clone -b master https://github.com/ossrs/ffmpeg-webrtc.git
cd ffmpeg-webrtc
./configure --enable-muxer=whip --enable-openssl --enable-version3 \
--enable-libx264 --enable-gpl --enable-libopus
make -j
Note: OpenSSL is mandatory for the DTLS handshake. Install OpenSSL, for instance with brew install openssl, and then point pkg-config at it by running export PKG_CONFIG_PATH="/usr/local/opt/openssl@3/lib/pkgconfig".
Note: For demonstration purposes, you can install libx264 by running brew install x264 and libopus by running brew install opus.
Although WebRTC can handle the x264 main and high profiles (without B frames), the baseline profile is advisable for better compatibility. If your stream doesn't use these codecs, you can transcode it with FFmpeg.
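As a sketch of such a transcode, the command below re-encodes a file to baseline H.264 and Opus while publishing it over WHIP. The input file name and the localhost SRS endpoint are placeholders; adjust them to your setup.

```shell
# Transcode to H.264 baseline + Opus and publish over WHIP.
# input.mp4 and the localhost endpoint are placeholders.
ffmpeg -re -i input.mp4 \
  -c:v libx264 -profile:v baseline \
  -c:a libopus -ar 48000 -ac 2 \
  -f whip "http://localhost:1985/rtc/v1/whip/?app=live&stream=livestream"
```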
If your WHIP-enabled FFmpeg lacks libx264, you must copy the video stream from a file and only encode the audio to Opus. This means preparing, in advance, a video file with baseline-profile H.264 video and Opus audio; note that preparing it requires another FFmpeg instance built with libx264 and libopus. You can then publish the source file as a WHIP stream without re-encoding, because it already uses the baseline H.264 profile and the Opus codec.
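A sketch of this two-step flow, assuming a second FFmpeg build with libx264 and libopus is on your PATH; the file names and the localhost endpoint are placeholders.

```shell
# Step 1: with an FFmpeg built with libx264 and libopus, prepare a file
# that already uses baseline H.264 video and Opus audio.
ffmpeg -i source.mp4 -c:v libx264 -profile:v baseline -c:a libopus source.mkv

# Step 2: with the WHIP-enabled FFmpeg, publish the file without re-encoding.
ffmpeg -re -i source.mkv -c copy \
  -f whip "http://localhost:1985/rtc/v1/whip/?app=live&stream=livestream"
```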
To enable FFmpeg to publish a stream, SRS can be used as the WHIP server; running it with docker is recommended.
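A sketch of the docker approach; the image tag and port mappings below are assumptions, so adjust them to your deployment.

```shell
# Run SRS in docker with the RTC-to-RTMP config.
# Image tag and port mappings are assumptions; 8000/udp carries WebRTC media.
docker run --rm -it \
  -p 1935:1935 -p 1985:1985 -p 8080:8080 -p 8000:8000/udp \
  ossrs/srs:5 ./objs/srs -c conf/rtc2rtmp.conf
```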
After publishing the stream to SRS, you can play the WHIP stream in a web browser such as Chrome, using srs-player.
The image below shows that the latency is around 150ms.
The RTMP, HTTP-FLV, or HTTP-TS stream remuxed by SRS can be played using ffplay, VLC, or srs-player.
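For example, the remuxed RTMP stream can be played with ffplay; the app and stream names below are assumptions matching the placeholder WHIP URL.

```shell
# Play the RTMP stream remuxed by SRS; app/stream names are placeholders.
ffplay rtmp://localhost/live/livestream
```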