Description
I started attaching PresentMon 2.0 to RTSS and ran into a weird issue with the GPUWait/GPULatency metrics reporting abnormally high values under certain conditions. My application supports two data import modes: one based on the PresentMon service and one based on the PresentMon console application. When I import those metrics via the PresentMon 2.0 console application, everything works as expected. However, reading GPUWait/GPULatency directly from the PresentMon 2.0 service via PresentMonAPI2.dll sometimes returns bogus data, mostly in high-framerate scenarios. I've created a tiny console application that just spawns the Text3D sample from the old D3D9 SDK; it runs at 3-5K FPS and triggers the issue pretty quickly on my system:
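(A minimal sketch of that spawner, assuming a CreateProcess-based launch and a hypothetical path to the Text3D binary; the actual repro code may differ.)

```cpp
// Minimal spawner sketch: launches the D3D9 SDK Text3D sample and waits for it.
// The sample path below is a placeholder; point it at wherever the SDK sample binary lives.
#include <windows.h>
#include <cstdio>

int main()
{
    // Hypothetical location of the Text3D sample executable
    wchar_t cmdLine[] = L"C:\\DXSDK\\Samples\\C++\\Direct3D\\Bin\\x86\\Text3D.exe";

    STARTUPINFOW si{};
    si.cb = sizeof(si);
    PROCESS_INFORMATION pi{};

    // Spawn the sample; unthrottled it renders at several thousand FPS
    if (!CreateProcessW(nullptr, cmdLine, nullptr, nullptr, FALSE,
                        0, nullptr, nullptr, &si, &pi))
    {
        wprintf(L"CreateProcess failed, error %lu\n", GetLastError());
        return 1;
    }

    wprintf(L"Spawned Text3D, PID %lu\n", pi.dwProcessId);

    // Keep the console app alive while the sample runs so the PID stays valid
    WaitForSingleObject(pi.hProcess, INFINITE);

    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return 0;
}
```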
Considering that in the error case it reports abnormally high values for both GPUWait and GPULatency, and that both of them are calculated as a difference against the GPUStart timestamp, I'd assume the wrong value is reported when this timestamp is uninitialized / zeroed for some reason.
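A rough illustration of why a zeroed GPUStart would blow the values up, assuming the metric is essentially (later timestamp − GPUStart) converted from QPC ticks to milliseconds (the exact formula inside the service may differ):

```cpp
// Illustration only: how a zeroed GPU-start QPC timestamp turns into an
// absurdly large latency value. Variable names are hypothetical.
#include <windows.h>
#include <cstdio>

int main()
{
    LARGE_INTEGER freq, now;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&now);

    const long long gpuStartQpc = 0;        // uninitialized / zeroed timestamp
    const long long presentQpc  = now.QuadPart;

    // (presentQpc - gpuStartQpc) is then the full QPC count since boot,
    // so the "latency" becomes hours instead of a fraction of a millisecond.
    const double bogusLatencyMs =
        1000.0 * double(presentQpc - gpuStartQpc) / double(freq.QuadPart);

    printf("bogus latency with zeroed GPUStart: %.0f ms\n", bogusLatencyMs);
    return 0;
}
```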
What really puzzles me is that I cannot reproduce this with PresentMon's own overlay, which also seems to communicate with the service via PresentMonAPI2.dll. Perhaps even in raw mode some additional filtering is still applied to the data before it is displayed? If so, I imagine it would look something like the sanity check sketched below.
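(A sketch only, not PresentMon's actual logic; the threshold is arbitrary.)

```cpp
// Crude client-side sanity filter: drops samples whose GPUWait/GPULatency
// values are implausibly large, which is what a zeroed GPUStart would produce.
#include <cmath>

struct MetricSample
{
    double gpuLatencyMs;
    double gpuWaitMs;
};

// Arbitrary upper bound: anything above one second per frame is treated as bogus
constexpr double kMaxPlausibleMs = 1000.0;

bool IsPlausible(const MetricSample& s)
{
    return std::isfinite(s.gpuLatencyMs) && std::isfinite(s.gpuWaitMs) &&
           s.gpuLatencyMs >= 0.0 && s.gpuLatencyMs < kMaxPlausibleMs &&
           s.gpuWaitMs    >= 0.0 && s.gpuWaitMs    < kMaxPlausibleMs;
}
```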