Use shader for YUV to RGB conversion on received frames #168
base: main
Conversation
chenosaurus
commented
Nov 7, 2025
- Replace the slow CPU conversion step with a GPU shader
Pull Request Overview
This PR introduces GPU-accelerated YUV to RGB video frame conversion for LiveKit's Unity SDK. The change replaces the previous CPU-based RGBA conversion with an optimized shader-based approach that processes YUV420 planar data directly on the GPU.
Key changes:
- Added YUV to RGB conversion shader using BT.709 limited range color space conversion
- Refactored VideoStream to support both GPU and CPU conversion paths with fallback mechanism
- Changed output texture from Texture2D to RenderTexture to support GPU rendering pipeline
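For reference, the BT.709 limited-range coefficients that a shader like this bakes in can be derived from the standard luma weights. This Python sketch (not part of the PR; variable names are illustrative) reproduces the constants that appear in the suggested shader code below:

```python
# Derive BT.709 limited-range YUV -> RGB coefficients from the
# standard luma weights. Limited ("video") range maps Y to
# [16, 235] and chroma to [16, 240] out of [0, 255].
Kr, Kb = 0.2126, 0.0722          # BT.709 red/blue luma weights
Kg = 1.0 - Kr - Kb

y_scale = 255.0 / 219.0          # luma excursion: 235 - 16 = 219
c_scale = 255.0 / 224.0          # chroma excursion: 240 - 16 = 224

r_v = c_scale * 2.0 * (1.0 - Kr)             # V coefficient for R
g_u = c_scale * 2.0 * Kb * (1.0 - Kb) / Kg   # U coefficient for G
g_v = c_scale * 2.0 * Kr * (1.0 - Kr) / Kg   # V coefficient for G
b_u = c_scale * 2.0 * (1.0 - Kb)             # U coefficient for B

print(y_scale, r_v, g_u, g_v, b_u)
# ~ 1.16438356, 1.79274107, 0.21324861, 0.53290933, 2.11240179
```

These match the literals in the half-precision shader suggestion further down in this thread.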
Reviewed Changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| Runtime/Shaders/YuvToRgb.shader | New HLSL shader implementing BT.709 YUV to RGB color space conversion |
| Runtime/Shaders/YuvToRgb.shader.meta | Unity metadata for shader asset |
| Runtime/Shaders.meta | Unity metadata for Shaders folder |
| Runtime/Scripts/VideoStream.cs | Refactored to use RenderTexture with GPU shader conversion and CPU fallback |
| LICENSE.txt.meta | Unity metadata for license file |
Pull Request Overview
Copilot reviewed 8 out of 8 changed files in this pull request and generated 4 comments.
HLSLPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"
Copilot AI commented Nov 7, 2025
The shader uses HLSLPROGRAM/ENDHLSL but includes "UnityCG.cginc", which is a CG/HLSL include file. While this may work in Unity, the include should typically be "UnityShaderVariables.cginc" or a pure HLSL include when using HLSLPROGRAM. Consider using CGPROGRAM/ENDCG instead of HLSLPROGRAM/ENDHLSL for better compatibility with UnityCG.cginc, or switch to proper HLSL includes if using HLSLPROGRAM.
Suggested change:
- #include "UnityCG.cginc"
+ #include "UnityShaderVariables.cginc"
ladvoc
left a comment
On-device testing was successful, just a few minor comments.
{
    public delegate void FrameReceiveDelegate(VideoFrame frame);
-   public delegate void TextureReceiveDelegate(Texture2D tex2d);
+   public delegate void TextureReceiveDelegate(Texture tex);
I think this constitutes a minor breaking API change—are we able to use Texture2D here? If not, we can note this in the release notes.
    private VideoStreamInfo _info;
    private bool _disposed = false;
    private bool _dirty = false;
+   private bool _useGpuShader = true;
Do you think we should expose an API to control this setting?
Maybe I should just remove this; not sure if we will encounter any use cases where the system doesn't have some sort of GPU.
    return o;
}

float3 yuvToRgb709Limited(float y, float u, float v)
I believe you should be able to use half precision here, which will save bandwidth and boost performance further.
Assuming you are using the legacy rendering pipeline:
inline half3 YUV709Limited_to_RGB(half y, half u, half v)
{
half c = y - half(16.0/255.0);
half d = u - half(128.0/255.0);
half e = v - half(128.0/255.0);
half Y = half(1.16438356) * c;
half3 rgb;
rgb.r = Y + half(1.79274107) * e;
rgb.g = Y - half(0.21324861) * d - half(0.53290933) * e;
rgb.b = Y + half(2.11240179) * d;
return saturate(rgb);
}
half4 frag(v2f i):SV_Target
{
half y = tex2D(_TexY, i.uv).r;
half u = tex2D(_TexU, i.uv).r; // U,V textures are W/2 x H/2
half v = tex2D(_TexV, i.uv).r;
return half4(YUV709Limited_to_RGB(y,u,v), 1.0h);
}
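The suggested function's behavior at the nominal limited-range endpoints can be sanity-checked with a direct Python port (a sketch only; float rather than half precision, and the function names are mine, not from the shader):

```python
def _saturate(x):
    # Clamp to [0, 1], mirroring HLSL saturate().
    return min(max(x, 0.0), 1.0)

def yuv709_limited_to_rgb(y, u, v):
    # Float-precision port of the suggested shader function,
    # using the same BT.709 limited-range constants.
    c = y - 16.0 / 255.0
    d = u - 128.0 / 255.0
    e = v - 128.0 / 255.0
    Y = 1.16438356 * c
    r = Y + 1.79274107 * e
    g = Y - 0.21324861 * d - 0.53290933 * e
    b = Y + 2.11240179 * d
    return _saturate(r), _saturate(g), _saturate(b)

# Nominal black (Y=16) and white (Y=235) with neutral chroma (U=V=128):
print(yuv709_limited_to_rgb(16 / 255, 128 / 255, 128 / 255))   # (0.0, 0.0, 0.0)
print(yuv709_limited_to_rgb(235 / 255, 128 / 255, 128 / 255))  # ~ (1.0, 1.0, 1.0)
```

Values below Y=16 or above Y=235 clamp to black or white respectively, which is the expected limited-range behavior.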
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>