ktxTextureLoader: Mark _useSRGBBuffers when loading an SRGB-enabled texture format #12362

Merged: 2 commits into BabylonJS:master from the compressed-textures branch on Apr 13, 2022

Conversation

@magcius (Contributor) commented Apr 7, 2022

Also set texture.format to the correct value, since it seems that code elsewhere, like in the WebGPU engine, expects a "base" texture format.

Completely untested patch to try to fix https://forum.babylonjs.com/t/compressed-ktx-texture-appears-too-dark/29060/4
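
A rough sketch of the intent (hypothetical helper name and illustrative WebGL enum values, not the actual loader code): an sRGB-enabled internal format from the KTX header gets split into a base format stored on texture.format plus the _useSRGBBuffer flag.

// Hypothetical sketch only: the real ktxTextureLoader is structured differently.
interface InternalTextureLike {
    format: number;
    _useSRGBBuffer: boolean;
}

// Illustrative WebGL enum values for one sRGB/base pair.
const COMPRESSED_RGBA_S3TC_DXT5_EXT = 0x83f3;
const COMPRESSED_SRGB_ALPHA_S3TC_DXT5_EXT = 0x8c4f;

function markFormatFromKtxHeader(texture: InternalTextureLike, glInternalFormat: number): void {
    switch (glInternalFormat) {
        case COMPRESSED_SRGB_ALPHA_S3TC_DXT5_EXT:
            // sRGB-enabled format: record the base format and flag the texture as sRGB.
            texture.format = COMPRESSED_RGBA_S3TC_DXT5_EXT;
            texture._useSRGBBuffer = true;
            break;
        // ...other sRGB pairs (ETC2, ASTC, BPTC) would be handled the same way...
        default:
            // Not an sRGB format: keep it as the base format.
            texture.format = glInternalFormat;
            texture._useSRGBBuffer = false;
            break;
    }
}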

@deltakosh deltakosh requested a review from bghgary April 7, 2022 21:32
@azure-pipelines

Please make sure to tag your PR with "bug", "new feature" or "breaking change" tags.

@magcius (Contributor, Author) commented Apr 7, 2022

I don't believe I can select labels -- the UI is treating them as read-only. Did I miss my chance?

@deltakosh deltakosh added the bug label Apr 7, 2022
@magcius magcius force-pushed the compressed-textures branch from a21ced5 to ed45b48 on April 7, 2022 22:00
@magcius magcius force-pushed the compressed-textures branch from ed45b48 to 8d90974 on April 7, 2022 22:18

@azure-pipelines

Snapshot stored with reference name:
refs/pull/12362/merge

Test environment:
https://babylonsnapshots.z22.web.core.windows.net/refs/pull/12362/merge/index.html

To test a playground, add it to the URL, for example:

https://babylonsnapshots.z22.web.core.windows.net/refs/pull/12362/merge/index.html#WGZLGJ#4600

To test the snapshot in the playground itself, use (for example):

https://playground.babylonjs.com/?snapshot=refs/pull/12362/merge#BCU1XR#0

@bghgary (Contributor) commented Apr 8, 2022

@magcius This code doesn't work because _useSRGBBuffers gets reset back to false here: https://github.com/BabylonJS/Babylon.js/blob/master/packages/dev/core/src/Engines/thinEngine.ts#L4485-L4486

@bghgary (Contributor) commented Apr 8, 2022

I have a fix.

@magcius (Contributor, Author) commented Apr 8, 2022

Ah, nice catch, thanks. I believe this would still fix the issue we were seeing with dark textures (the issue there is that the SRGB format and the shader fallback are both used), but ideally we'd be using the native SRGB hardware support instead of the shader fallback.

There is a similar bug in the KTX2 texture loader where it won't properly use SRGB buffers either, because it only sets texture.gammaTexture and not _useSRGBBuffer (it should set both, I believe).

My preferred fix for this would be to drop the glInternalFormat from the compressed texture upload function, and replace it with code that reconstructs the right texture format based on texture.format and _useSRGBBuffer, similar to how the webgpuEngine works. Does that sound good to you?

edit: nvm, OK, sounds good.
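
A rough sketch of that proposal (hypothetical helper and illustrative WebGL enum values, not the actual engine code): reconstruct the GL internal format from the base texture.format plus _useSRGBBuffer, similar in spirit to the WebGPU path.

// Hypothetical sketch: derive the final GL internal format from the base format
// plus the _useSRGBBuffer flag, instead of passing glInternalFormat through.
const COMPRESSED_RGBA_S3TC_DXT5_EXT = 0x83f3;
const COMPRESSED_SRGB_ALPHA_S3TC_DXT5_EXT = 0x8c4f;
const COMPRESSED_RGBA8_ETC2_EAC = 0x9278;
const COMPRESSED_SRGB8_ALPHA8_ETC2_EAC = 0x9279;

function getGlInternalFormat(baseFormat: number, useSRGBBuffer: boolean): number {
    if (!useSRGBBuffer) {
        return baseFormat;
    }
    switch (baseFormat) {
        case COMPRESSED_RGBA_S3TC_DXT5_EXT:
            return COMPRESSED_SRGB_ALPHA_S3TC_DXT5_EXT;
        case COMPRESSED_RGBA8_ETC2_EAC:
            return COMPRESSED_SRGB8_ALPHA8_ETC2_EAC;
        default:
            // No known sRGB variant: fall back to the base (linear) format.
            return baseFormat;
    }
}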

@bghgary (Contributor) commented Apr 8, 2022

My local fix isn't quite right. There are still problems with it. I'm still checking.

@bghgary (Contributor) commented Apr 11, 2022

@sebavan @deltakosh I've made some changes that make this work for KTX 1. KTX 2 works, but it doesn't use sRGB buffers even when they are available. Other texture loaders (e.g. DDS) don't seem to support sRGB formats. Should we try to fix sRGB formats more broadly?

@bghgary bghgary marked this pull request as ready for review April 11, 2022 23:53

@magcius (Contributor, Author) commented Apr 12, 2022

KTX2 doesn't use SRGB buffers because it only sets gammaTexture, not _useSRGBBuffers. I think a proper fix for that would be to decide the SRGB buffer-ness at texture creation time, like how the WebGPU backend works.

@bghgary (Contributor) commented Apr 12, 2022

I think a proper fix for that would be to decide the SRGB buffer-ness at texture creation time, like how the WebGPU backend works.

Can you explain what you mean by this? I don't see any code that decides this flag at texture creation in a way that differs from WebGL, or from when texture loaders (e.g. KTX) are used.

@magcius (Contributor, Author) commented Apr 12, 2022

gpuTextureWrapper.format = WebGPUTextureHelper.GetWebGPUTextureFormat(texture.type, texture.format, texture._useSRGBBuffer);

This decides the final texture format at texture creation time.

The idea is we'd do something similar in the WebGL backend as well -- any time someone creates a texture with BC1 and _useSRGBBuffers is set to true, we'd use BC1_SRGB. Currently, _useSRGBBuffers just forces the format the other way -- if it's set to false, it will strip SRGB from the format.

So the path would be:

  • KTX1 has glInternalFormat = ETC2_SRGB. The KTX1 loader would split this into texture.format = ETC2 and _useSRGBBuffers = true
  • KTX2 has transcoded texture format (e.g. ETC2) and SRGB comes from the DTF flag. More straightforward: texture.format = ETC2 and _useSRGBBuffers = true

The texture creation code will merge both of these so even though texture.format is ETC2, the texture's internal format is ETC2_SRGB, and marked at texture creation time. In a WebGL2 context, we could even take the opportunity to call texStorage2D() here for a performance improvement.

Then we don't need any format during upload... it was fully decided at texture creation time.
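
A rough sketch of that creation-time path in a WebGL2 context (hypothetical helper, not the engine's actual code): the sized internal format (sRGB or not) is resolved once, immutable storage is allocated with texStorage2D, and the per-mip uploads no longer carry a format decision of their own.

function createCompressedTexture(
    gl: WebGL2RenderingContext,
    sizedInternalFormat: number, // already resolved from texture.format + _useSRGBBuffer
    width: number,
    height: number,
    mipData: ArrayBufferView[]
): WebGLTexture {
    const texture = gl.createTexture();
    if (!texture) {
        throw new Error("Failed to create texture");
    }
    gl.bindTexture(gl.TEXTURE_2D, texture);

    // The internal format is fixed here, at creation time.
    gl.texStorage2D(gl.TEXTURE_2D, mipData.length, sizedInternalFormat, width, height);

    for (let level = 0; level < mipData.length; level++) {
        const w = Math.max(1, width >> level);
        const h = Math.max(1, height >> level);
        // Upload each mip; the format argument simply has to match the storage above.
        gl.compressedTexSubImage2D(gl.TEXTURE_2D, level, 0, 0, w, h, sizedInternalFormat, mipData[level]);
    }
    return texture;
}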

@bghgary (Contributor) commented Apr 13, 2022

The idea is we'd do something similar in the WebGL backend as well -- any time someone creates a texture with BC1 and _useSRGBBuffers is set to true, we'd use BC1_SRGB.

There is similar code for WebGL here for the compressed texture part:

public _uploadCompressedDataToTextureDirectly(
    texture: InternalTexture,
    internalFormat: number,
    width: number,
    height: number,
    data: ArrayBufferView,
    faceIndex: number = 0,
    lod: number = 0
) {
    const gl = this._gl;
    let target = gl.TEXTURE_2D;
    if (texture.isCube) {
        target = gl.TEXTURE_CUBE_MAP_POSITIVE_X + faceIndex;
    }
    if (texture._useSRGBBuffer) {
        switch (internalFormat) {
            case Constants.TEXTUREFORMAT_COMPRESSED_RGBA_BPTC_UNORM:
                internalFormat = gl.COMPRESSED_SRGB_ALPHA_BPTC_UNORM_EXT;
                break;
            case Constants.TEXTUREFORMAT_COMPRESSED_RGBA_ASTC_4x4:
                internalFormat = gl.COMPRESSED_SRGB8_ALPHA8_ASTC_4x4_KHR;
                break;
            case Constants.TEXTUREFORMAT_COMPRESSED_RGB_S3TC_DXT1:
                if (this._caps.s3tc_srgb) {
                    internalFormat = gl.COMPRESSED_SRGB_ALPHA_S3TC_DXT1_EXT;
                } else {
                    // S3TC sRGB extension not supported
                    texture._useSRGBBuffer = false;
                }
                break;
            case Constants.TEXTUREFORMAT_COMPRESSED_RGBA_S3TC_DXT5:
                if (this._caps.s3tc_srgb) {
                    internalFormat = gl.COMPRESSED_SRGB_ALPHA_S3TC_DXT5_EXT;
                } else {
                    // S3TC sRGB extension not supported
                    texture._useSRGBBuffer = false;
                }
                break;
            default:
                // We don't support a sRGB format corresponding to internalFormat, so revert to non sRGB format
                texture._useSRGBBuffer = false;
                break;
        }
    }
    this._gl.compressedTexImage2D(target, lod, internalFormat, width, height, 0, <DataView>data);
}

Currently, _useSRGBBuffers just forces the format the other way -- if it's set to false, it will strip SRGB from the format.

Unless I missed something, that's not what happens in the function you referenced for WebGPU.

KTX1 has glInternalFormat = ETC2_SRGB. The KTX1 loader would split this into texture.format = ETC2 and _useSRGBBuffers = true

This is what is happening in your PR. I just fixed it so that it works. I think we should be good to go for KTX1.

KTX2 has transcoded texture format (e.g. ETC2) and SRGB comes from the DTF flag. More straightforward: texture.format = ETC2 and _useSRGBBuffers = true

Right. We can additionally fix this, but it should be a separate fix.

The texture creation code will merge both of these so even though texture.format is ETC2, the texture's internal format is ETC2_SRGB, and marked at texture creation time.

Just for a bit of context, the main reason this _useSRGBBuffers flag exists is that, at runtime, we may or may not have sRGB buffer support. This support can be disabled in the engine capabilities, or be unavailable in WebGL1 when EXT_sRGB is not present. When sRGB buffers are not supported, the engine falls back to converting the color in the shader, which is controlled by the _gammaSpace flag on the internal texture.
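
A rough illustration of that fallback (hypothetical helper; the field and capability names here are illustrative, not necessarily the engine's exact ones):

// Hypothetical sketch of the runtime decision described above: if sRGB buffers
// are not supported (disabled in the caps, or WebGL1 without EXT_sRGB), drop
// _useSRGBBuffer and rely on the shader-side gamma conversion instead.
interface EngineCapsLike {
    supportSRGBBuffers: boolean;
}

interface InternalTextureLike {
    _useSRGBBuffer: boolean;
    _gammaSpace: boolean; // true when the data is treated as stored in gamma space
}

function resolveSRGBHandling(texture: InternalTextureLike, caps: EngineCapsLike): void {
    if (texture._useSRGBBuffer && !caps.supportSRGBBuffers) {
        // No hardware sRGB sampling available: fall back to converting in the shader.
        texture._useSRGBBuffer = false;
        texture._gammaSpace = true;
    }
}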

@magcius (Contributor, Author) commented Apr 13, 2022

Sounds fine to me. Let's get this fix merged now and deal with the rest later.

@bghgary bghgary merged commit 43eaf41 into BabylonJS:master Apr 13, 2022
@magcius magcius deleted the compressed-textures branch April 13, 2022 20:42