
Quality degradation of AI denoising when albedo/normals/depth are backed by D3D12 resources #15

@BartSiwek

Description

While integrating the RIF AI denoiser into an existing D3D12 pipeline, I stumbled upon an issue: if the albedo/normals/depth RIF images are backed by D3D12 resources (created using rifContextCreateImageFromDirectX12Memory), the resulting image degrades in quality. If the same RIF images are not resource backed (created using rifContextCreateImage) and the same content is copied into them via rifImageMap/rifImageUnmap, the quality is good.
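
For reference, a minimal sketch of the two upload paths being compared. The rif_image_desc fields, the component type, and the shape of the D3D12-interop call are assumptions based on the call names above rather than a copy of the actual pipeline code, so treat them as illustrative only.

```cpp
#include <cstring>
#include "RadeonImageFilters.h"

// Assumed to exist already: context (rif_context), width/height,
// albedo_pixels (float RGBA data) and the D3D12 albedo resource.

rif_image_desc desc = {};
desc.image_width    = width;
desc.image_height   = height;
desc.num_components = 4;                           // RGBA
desc.type           = RIF_COMPONENT_TYPE_FLOAT32;  // HDR inputs ("useHDR" is true)

// Path A: plain RIF image, contents copied in via map/unmap -> good quality.
rif_image albedo = nullptr;
rifContextCreateImage(context, &desc, nullptr, &albedo);

void* mapped = nullptr;
rifImageMap(albedo, RIF_IMAGE_MAP_WRITE, &mapped);
std::memcpy(mapped, albedo_pixels, width * height * 4 * sizeof(float));
rifImageUnmap(albedo, mapped);

// Path B: the same image backed by an existing D3D12 resource -> degraded quality.
// The exact argument list of rifContextCreateImageFromDirectX12Memory is not
// reproduced here; see the RIF D3D12 interop header.
rif_image albedo_dx12 = nullptr;
// rifContextCreateImageFromDirectX12Memory(context, &desc, /* D3D12 resource args */, &albedo_dx12);
```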

Example of good quality output:
[image: good quality]
Example of bad quality output (note the acne-like artifacts on the bottom part of the ball and close to the plane edges):
[image: bad quality]

Notes:

  • The RIF image descriptors are exactly the same in both cases
  • In both cases the context is created using rifCreateContextFromDirectX12Context(RIF_API_VERSION, d3d12_device, d3d12_queue, nullptr, &context)
  • In both cases the queue is created using the rifContextCreateCommandQueue(context, &queue)
  • This issue does not apply to the "colorImg" filter parameter - as far as I have noticed, there is no quality degradation when this RIF image is D3D12 resource backed
  • This issue does not apply to the filter output - as far as I have noticed, there is no quality degradation when this RIF image is D3D12 resource backed
  • The type of the backing resource does not matter - it happens both with textures and buffers
  • The "useHDR" filter parameter is set to true
