ktxTextureLoader: Mark _useSRGBBuffers when loading an SRGB-enabled texture format #12362
Conversation
Please make sure to tag your PR with "bug", "new feature" or "breaking change" tags.
I don't believe I can select labels -- the UI is treating them as read-only. Did I miss my chance?
Snapshot stored with reference name:
Test environment: to test a playground, add it to the URL, for example: https://babylonsnapshots.z22.web.core.windows.net/refs/pull/12362/merge/index.html#WGZLGJ#4600
To test the snapshot in the playground itself, use (for example): https://playground.babylonjs.com/?snapshot=refs/pull/12362/merge#BCU1XR#0
@magcius This code doesn't work because
I have a fix. |
Ah, nice catch, thanks. I believe this would still fix the issue we were seeing with dark textures (the issue there is that the SRGB format and the shader fallback are both used), but ideally we'd be using the native SRGB hardware support instead of the shader fallback.

There is a similar bug in the KTX2 texture loader, where it won't properly use SRGB buffers either, because it only sets texture.gammaTexture and not _useSRGBBuffer (it should set both, I believe).

My preferred fix for this would be to drop the glInternalFormat from the compressed texture upload function and replace it with code that reconstructs the right texture format based on texture.format and _useSRGBBuffer, similar to how the webgpuEngine works. Does that sound good to you?

edit: nvm, OK, sounds good.
My local fix isn't quite right. There are still problems with it. I'm still checking. |
@sebavan @deltakosh I've made some changes that make this work for KTX 1. KTX 2 works, but it doesn't use sRGB buffers even when they're available. Other texture loaders (e.g. DDS) don't seem to support sRGB formats. Should we try to fix sRGB formats more broadly?
KTX2 doesn't use sRGB buffers because it only sets gammaTexture, not _useSRGBBuffer. I think a proper fix for that would be to decide the sRGB buffer-ness at texture creation time, like the WebGPU backend does.
Can you explain what you mean by this? I don't see any code that decides this flag at texture creation time differently from WebGL, or when using texture loaders (e.g. KTX).
This decides the final texture format at texture creation time. The idea is that we'd do something similar in the WebGL backend as well: any time someone creates a texture with BC1 and _useSRGBBuffer set to true, we'd use BC1_SRGB. Currently, _useSRGBBuffer only forces the format the other way: if it's set to false, it strips SRGB from the format. So the path would be:
the texture creation code merges both of these, so even though texture.format is ETC2, the texture's internal format is ETC2_SRGB, decided and marked at texture creation time. In a WebGL2 context, we could even take the opportunity to call texStorage2D() here for a performance improvement. Then we don't need any format during upload; it was fully decided at texture creation time.
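The creation-time merge described above can be sketched as follows. The enum values are the real WebGL extension constants, but the function name and mapping table are illustrative assumptions, not the actual thinEngine code:

```typescript
// Real WebGL extension enum values for two compressed formats and their
// sRGB siblings.
const COMPRESSED_RGB8_ETC2 = 0x9274;      // WEBGL_compressed_texture_etc
const COMPRESSED_SRGB8_ETC2 = 0x9275;
const COMPRESSED_RGB_S3TC_DXT1 = 0x83f0;  // "BC1", WEBGL_compressed_texture_s3tc
const COMPRESSED_SRGB_S3TC_DXT1 = 0x8c4c; // WEBGL_compressed_texture_s3tc_srgb

// Base (linear) format -> sRGB sibling.
const srgbSibling = new Map<number, number>([
  [COMPRESSED_RGB8_ETC2, COMPRESSED_SRGB8_ETC2],
  [COMPRESSED_RGB_S3TC_DXT1, COMPRESSED_SRGB_S3TC_DXT1],
]);

// Decide the internal format once, at texture creation, from the "base"
// texture.format plus the _useSRGBBuffer flag. Uploads then need no format,
// and in WebGL2 the result could feed a texStorage2D() allocation up front.
function resolveInternalFormat(baseFormat: number, useSRGBBuffer: boolean): number {
  if (!useSRGBBuffer) {
    return baseFormat; // flag forces linear: keep the base format as-is
  }
  return srgbSibling.get(baseFormat) ?? baseFormat; // no sRGB sibling: fall back
}
```

With this in place, texture.format stays the base format (ETC2) while the GPU-side internal format becomes ETC2_SRGB whenever the flag is set and the sibling exists.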
There is similar code for WebGL here, for the compressed texture part: Babylon.js/packages/dev/core/src/Engines/thinEngine.ts, lines 4444 to 4492 at 81fd1dc.
Unless I missed something, that's not what happens in the function you referenced for WebGPU.
This is what is happening in your PR. I just fixed it so that it works. I think we should be good to go for KTX1.
Right. We can additionally fix this, but it should be a separate fix.
Just for a bit of context: the main reason this _useSRGBBuffer flag exists is that, at runtime, we may or may not have sRGB buffer support. It can be disabled in the engine capabilities, or unavailable in WebGL1 when EXT_sRGB is not present. When sRGB buffers are not supported, the engine falls back to converting the color in the shader, which is controlled by the
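The runtime choice described above can be sketched like this; the function names are assumptions for illustration, and the per-channel decode is the standard sRGB transfer function a shader fallback would apply:

```typescript
// Standard sRGB EOTF (per channel, input in [0, 1]): what the shader
// fallback computes when hardware sRGB sampling is unavailable.
function srgbToLinear(c: number): number {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Pick the sampling path: hardware sRGB buffers when the capability exists
// (WebGL2, or WebGL1 with EXT_sRGB, and not disabled in engine caps),
// otherwise decode gamma in the shader.
function pickSRGBPath(supportsSRGBBuffers: boolean, textureIsSRGB: boolean): string {
  if (!textureIsSRGB) return "linear";
  return supportsSRGBBuffers ? "hardware-srgb-buffer" : "shader-fallback";
}
```

The bug this PR fixes is the case where both paths were active at once: the texture was uploaded in an sRGB format and the shader decoded gamma again, darkening the result.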
Sounds fine to me. Let's get this fix merged now and deal with the rest later. |
Also set texture.format to the correct value,
since it seems the code elsewhere, like in the WebGPU engine, would
like a "base" texture format.
Completely untested patch to try and fix https://forum.babylonjs.com/t/compressed-ktx-texture-appears-too-dark/29060/4