
outputEncoding not handled correctly when using sRGB render target in WebGL2 #23251

@chubei-oppen

Description

Describe the bug

Since #23129, the outputEncoding used when rendering to a render target is calculated differently than before. I believe the previous calculation was correct.
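For context, here is how I understand the change. The sketch below paraphrases the selection of the shader's output encoding; the function names, parameters, and branch structure are my own illustration, not the actual three.js source (see #23129 for the real logic):

import * as THREE from 'three';

// r136 and earlier: the shader encodes its output to match the bound
// render target texture's encoding, or the renderer's outputEncoding
// when drawing to the canvas.
function outputEncodingBefore( renderer, renderTarget ) {

	return ( renderTarget === null ) ? renderer.outputEncoding : renderTarget.texture.encoding;

}

// dev after #23129 (approximate): a WebGL2 sRGB render target is backed by
// SRGB8_ALPHA8 storage, which encodes on write, so the shader is expected
// to output linear values when a render target is bound.
function outputEncodingAfter( renderer, renderTarget ) {

	if ( renderTarget === null ) return renderer.outputEncoding;

	return THREE.LinearEncoding;

}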

To Reproduce

See the following code and the fiddle below.

Code

import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer();
renderer.setSize( window.innerWidth, window.innerHeight );
renderer.outputEncoding = THREE.sRGBEncoding;
renderer.setPixelRatio( window.devicePixelRatio );
document.body.appendChild( renderer.domElement );

const geometry = new THREE.PlaneGeometry( 2, 2 );
const material = new THREE.MeshBasicMaterial();
const mesh = new THREE.Mesh( geometry, material );

// Intermediate target with the same encoding as the canvas.
const renderTarget = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight, { encoding: THREE.sRGBEncoding } );

const camera = new THREE.OrthographicCamera();
camera.position.set( 0, 0, 1 );

new THREE.TextureLoader().load( 'https://threejs.org/examples/textures/crate.gif', ( texture ) => {

  texture.encoding = THREE.sRGBEncoding;

  // First pass: draw the sRGB texture into the sRGB render target.
  material.map = texture;
  renderer.setRenderTarget( renderTarget );
  renderer.render( mesh, camera );

  // Second pass: draw the render target's texture to the canvas.
  material.map = renderTarget.texture;
  renderer.setRenderTarget( null );
  renderer.render( mesh, camera );

} );

Live example

Expected behavior

The dev build should render the same result as r136.
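One way to make the difference measurable is to write a known linear color into the sRGB target and read the stored bytes back with renderer.readRenderTargetPixels. This sketch reuses the setup from the reproduction above; the 0.5 gray probe and the expected byte values are my assumptions, not from the report:

// Sketch: render linear 0.5 gray into the sRGB render target and inspect
// the stored bytes. If the linear-to-sRGB encode happens exactly once,
// the readback should be roughly 188 ( 255 * sRGB( 0.5 ) ); ~128 suggests
// no encode, ~222 suggests a double encode.
const probeMaterial = new THREE.MeshBasicMaterial( { color: new THREE.Color( 0.5, 0.5, 0.5 ) } );
const probeMesh = new THREE.Mesh( geometry, probeMaterial );

renderer.setRenderTarget( renderTarget );
renderer.render( probeMesh, camera );

const pixel = new Uint8Array( 4 );
renderer.readRenderTargetPixels( renderTarget, 0, 0, 1, 1, pixel );
console.log( pixel[ 0 ] ); // ~188 expected if encoded exactly once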

Screenshots

r136: [screenshot]

dev: [screenshot]

Platform:

  • Device: Desktop
  • OS: Linux
  • Browser: Chrome
  • Three.js version: dev
