This project is based on cloud rendering.

For the detailed environment setup, please refer to SETUP.md. Ubuntu and macOS are recommended. Note that the server file comes from the libdatachannel repo.
```sh
git submodule update --init --recursive
cmake -B cmake-build-debug
cd cmake-build-debug
make
```
Before running the program, make sure the servers are set up:
```sh
cd client
python3 signaling-server.py
python3 -m http.server --bind 127.0.0.1 8080
```
Then run the program with `./ors`, open http://127.0.0.1:8080/ in your browser, and press the Start button.
| Parameter | Function |
|---|---|
| `-docker` | Run in Docker mode |
| `-d` | Dump the encoded video for inspection |
| `-gpu num` | Select the hardware encoder: 0 = CPU, 1 = NVENC, 2 = VAAPI, 3 = QSV, 4 = VideoToolbox (e.g. `-gpu 4` uses VideoToolbox) |
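For reference, the numeric `-gpu` codes line up with FFmpeg's standard H.264 encoder names. A minimal lookup sketch is below; the encoder names (`libx264`, `h264_nvenc`, etc.) are real FFmpeg encoders, but how `ors` maps the flag internally is an assumption, and `encoder_for` is a hypothetical helper:

```python
# Hypothetical mapping from the -gpu code to an FFmpeg H.264 encoder name.
# The encoder names are standard FFmpeg encoders; the internal wiring of
# the -gpu flag in ors is assumed, not taken from the source.
GPU_ENCODERS = {
    0: "libx264",            # CPU (software) encoding
    1: "h264_nvenc",         # NVIDIA NVENC
    2: "h264_vaapi",         # VAAPI (Intel/AMD on Linux)
    3: "h264_qsv",           # Intel Quick Sync Video
    4: "h264_videotoolbox",  # Apple VideoToolbox (macOS)
}

def encoder_for(gpu_code: int) -> str:
    """Return the FFmpeg encoder name for a -gpu code, defaulting to CPU."""
    return GPU_ENCODERS.get(gpu_code, "libx264")

print(encoder_for(4))  # h264_videotoolbox
```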
Lots of work still needs to be done:
- Record the OpenGL app's screen and encode it to H.264
- RTMP streamer, which pushes raw H.264 frames from the buffer to the server
- WebRTC
- Transfer terminal controls to the server
- Use librtc (originally named libdatachannel) to send video from memory
- Enable browser control
- Accelerate encoding (partial)
- Integrate with WebGL rendering
- Task scheduling across local and remote machines
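One detail worth noting for the screen-recording item: `glReadPixels` returns rows bottom-up (OpenGL's origin is the bottom-left corner), while H.264 encoders expect frames top-down, so the capture path needs a vertical flip. A minimal sketch of that flip on a raw RGBA buffer (the frame size and pixel layout here are illustrative, not taken from the project):

```python
def flip_vertical(frame: bytes, width: int, height: int, bpp: int = 4) -> bytes:
    """Reverse the row order of a raw frame (e.g. glReadPixels RGBA output)
    so the first row in memory is the top of the image."""
    stride = width * bpp
    rows = [frame[i * stride:(i + 1) * stride] for i in range(height)]
    return b"".join(reversed(rows))

# Tiny 2x2 RGBA frame, bottom row first, as glReadPixels delivers it.
bottom_row = bytes([255, 0, 0, 255] * 2)  # red pixels
top_row = bytes([0, 0, 255, 255] * 2)     # blue pixels
frame = bottom_row + top_row
flipped = flip_vertical(frame, width=2, height=2)
assert flipped == top_row + bottom_row    # top row now comes first
```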
Please note: this repo is licensed under MIT, but `3rd_party/librtc` is licensed under GPL.