While trying to integrate the RIF AI denoiser into an existing D3D12 pipeline, I stumbled upon an issue: if the albedo/normals/depth RIF images are backed by a D3D12 resource (created with rifContextCreateImageFromDirectX12Memory), the resulting image degrades in quality. If the RIF images are instead not resource backed (created with rifContextCreateImage) and the same content is copied into them via rifImageMap/rifImageUnmap, the quality is good.
Example of good quality output:

Example of bad quality output (note the acne-like artifacts on the bottom part of the ball and close to the plane edges):

Notes:
- The RIF image descriptors are exactly the same in both cases.
- In both cases the context is created with rifCreateContextFromDirectX12Context(RIF_API_VERSION, d3d12_device, d3d12_queue, nullptr, &context).
- In both cases the queue is created with rifContextCreateCommandQueue(context, &queue).
- This issue does not apply to the "colorImg" filter parameter: as far as I noticed, there is no quality degradation when that RIF image is D3D12 resource backed.
- This issue does not apply to the filter output either: as far as I noticed, there is no quality degradation when that RIF image is D3D12 resource backed.
- The type of the backing resource does not matter: it happens with both textures and buffers.
- The "useHDR" filter parameter is set to true.
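For reference, the non-resource-backed path that produces good output looks roughly like this. This is a minimal sketch, not my exact integration code: the dimensions, the 4-component float format, and the `srcPixels` staging pointer are placeholder assumptions, and error handling is collapsed to single checks.

```cpp
#include <cstring>
#include <RadeonImageFilters.h>

// Sketch (assumed names/format): create a plain RIF image with
// rifContextCreateImage and copy an AOV's contents into it via
// rifImageMap/rifImageUnmap. This is the path where the denoised
// result comes out clean, unlike the D3D12-resource-backed path.
rif_image CreateAndFillImage(rif_context context,
                             const float* srcPixels, // placeholder CPU copy of the AOV
                             rif_uint width, rif_uint height)
{
    rif_image_desc desc = {};
    desc.image_width    = width;
    desc.image_height   = height;
    desc.image_depth    = 1;
    desc.num_components = 4;                        // assumed RGBA float layout
    desc.type           = RIF_COMPONENT_TYPE_FLOAT32;

    rif_image image = nullptr;
    if (rifContextCreateImage(context, &desc, nullptr, &image) != RIF_SUCCESS)
        return nullptr;

    void* mapped = nullptr;
    if (rifImageMap(image, RIF_IMAGE_MAP_WRITE, &mapped) == RIF_SUCCESS)
    {
        std::memcpy(mapped, srcPixels,
                    size_t(width) * height * desc.num_components * sizeof(float));
        rifImageUnmap(image, mapped);
    }
    return image;
}
```

The image descriptor used here is exactly the one passed in the resource-backed case as well; only the creation call and the map/unmap copy differ.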