Closed
Description
I am working on Rust bindings for RenderDoc and am trying to replace my hand-rolled FFI code with bindgen. However, I am running into incorrect and inconsistent enum representations between Windows and Unix-like targets; on Windows, the resulting code doesn't even compile when integrated into my bindings.
The bug is triggered by a single line in the input header below, and appears to be tangentially related to #1185.
Input C/C++ Header
#include <stdint.h>

/* RENDERDOC_CC is RenderDoc's calling-convention macro, defined
   elsewhere in renderdoc_app.h (e.g. __cdecl on Windows). */
typedef enum {
eRENDERDOC_Overlay_Enabled = 0x1,
eRENDERDOC_Overlay_FrameRate = 0x2,
eRENDERDOC_Overlay_FrameNumber = 0x4,
eRENDERDOC_Overlay_CaptureList = 0x8,
eRENDERDOC_Overlay_Default = (eRENDERDOC_Overlay_Enabled | eRENDERDOC_Overlay_FrameRate |
eRENDERDOC_Overlay_FrameNumber | eRENDERDOC_Overlay_CaptureList),
eRENDERDOC_Overlay_All = ~0U, // <-- This is the culprit!
eRENDERDOC_Overlay_None = 0,
} RENDERDOC_OverlayBits;
typedef uint32_t(RENDERDOC_CC *pRENDERDOC_GetOverlayBits)();
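For context (this is a standalone check, not part of the generated bindings), a small Rust snippet illustrates why the two targets can disagree about this one line: `~0U` is the all-ones 32-bit pattern, which reads as `u32::MAX` when unsigned and as `-1` when reinterpreted as a signed `i32`.

```rust
fn main() {
    // C's ~0U flips every bit of an unsigned 32-bit zero.
    let all_bits: u32 = !0u32;
    // As unsigned, this is 4294967295 -- the value in the Unix output.
    assert_eq!(all_bits, u32::MAX);
    // The same bit pattern reinterpreted as signed 32-bit is -1,
    // which is exactly what the Windows output prints.
    assert_eq!(all_bits as i32, -1);
    println!("unsigned: {}, signed: {}", all_bits, all_bits as i32);
}
```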
Bindgen Invocation
bindgen::Builder::default()
.header("input.h")
.generate()
.unwrap()
Actual Results
On Unix, bindgen correctly outputs:
pub const RENDERDOC_OverlayBits_eRENDERDOC_Overlay_Enabled: RENDERDOC_OverlayBits = 1;
pub const RENDERDOC_OverlayBits_eRENDERDOC_Overlay_FrameRate: RENDERDOC_OverlayBits = 2;
pub const RENDERDOC_OverlayBits_eRENDERDOC_Overlay_FrameNumber: RENDERDOC_OverlayBits = 4;
pub const RENDERDOC_OverlayBits_eRENDERDOC_Overlay_CaptureList: RENDERDOC_OverlayBits = 8;
pub const RENDERDOC_OverlayBits_eRENDERDOC_Overlay_Default: RENDERDOC_OverlayBits = 15;
pub const RENDERDOC_OverlayBits_eRENDERDOC_Overlay_All: RENDERDOC_OverlayBits = 4294967295; // Technically correct value, but literal is out of range.
pub const RENDERDOC_OverlayBits_eRENDERDOC_Overlay_None: RENDERDOC_OverlayBits = 0;
pub type RENDERDOC_OverlayBits = u32; // Correct type.
pub type pRENDERDOC_GetOverlayBits = ::std::option::Option<unsafe extern "C" fn() -> u32>;
When cross-compiling from Unix to Windows, bindgen incorrectly outputs:
pub const RENDERDOC_OverlayBits_eRENDERDOC_Overlay_Enabled: RENDERDOC_OverlayBits = 1;
pub const RENDERDOC_OverlayBits_eRENDERDOC_Overlay_FrameRate: RENDERDOC_OverlayBits = 2;
pub const RENDERDOC_OverlayBits_eRENDERDOC_Overlay_FrameNumber: RENDERDOC_OverlayBits = 4;
pub const RENDERDOC_OverlayBits_eRENDERDOC_Overlay_CaptureList: RENDERDOC_OverlayBits = 8;
pub const RENDERDOC_OverlayBits_eRENDERDOC_Overlay_Default: RENDERDOC_OverlayBits = 15;
pub const RENDERDOC_OverlayBits_eRENDERDOC_Overlay_All: RENDERDOC_OverlayBits = -1; // Incorrect value!
pub const RENDERDOC_OverlayBits_eRENDERDOC_Overlay_None: RENDERDOC_OverlayBits = 0;
pub type RENDERDOC_OverlayBits = i32; // Incorrect type!
pub type pRENDERDOC_GetOverlayBits = ::std::option::Option<unsafe extern "C" fn() -> u32>;
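To show concretely why the Windows output breaks downstream code, here is a hypothetical consumer sketch (the constant and type names mirror the generated bindings, but `set_overlay_bits` is an assumed wrapper, not part of the RenderDoc API): the function pointer type still returns `u32`, yet the constants are now `i32`, so every use site needs a cast.

```rust
#![allow(non_camel_case_types, non_upper_case_globals)]

// What bindgen emits on the Windows target:
type RENDERDOC_OverlayBits = i32;
const RENDERDOC_OverlayBits_eRENDERDOC_Overlay_All: RENDERDOC_OverlayBits = -1;

// Hypothetical wrapper around the RenderDoc API, which deals in u32 masks.
fn set_overlay_bits(bits: u32) -> u32 {
    bits
}

fn main() {
    // set_overlay_bits(RENDERDOC_OverlayBits_eRENDERDOC_Overlay_All);
    // ^ does not compile: expected `u32`, found `i32`.
    let bits = set_overlay_bits(RENDERDOC_OverlayBits_eRENDERDOC_Overlay_All as u32);
    assert_eq!(bits, u32::MAX); // the cast recovers 4294967295
}
```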
Expected Results
The correct enum representation in this case should have been uint32_t on both platforms. I would like to see better support for ~0U / u32::MAX on the bindgen side of things.
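As a stopgap until this is fixed, one option (a hand-rolled sketch, not a bindgen feature) is to bypass the generated enum and pin the representation to `u32` manually, using `u32::MAX` for the all-bits value; the constant names here are illustrative, not the generated ones.

```rust
#![allow(non_camel_case_types)]

// Hand-rolled replacement that fixes the type to u32 on every platform.
pub type RENDERDOC_OverlayBits = u32;
pub const OVERLAY_ENABLED: RENDERDOC_OverlayBits = 0x1;
pub const OVERLAY_FRAME_RATE: RENDERDOC_OverlayBits = 0x2;
pub const OVERLAY_FRAME_NUMBER: RENDERDOC_OverlayBits = 0x4;
pub const OVERLAY_CAPTURE_LIST: RENDERDOC_OverlayBits = 0x8;
pub const OVERLAY_DEFAULT: RENDERDOC_OverlayBits =
    OVERLAY_ENABLED | OVERLAY_FRAME_RATE | OVERLAY_FRAME_NUMBER | OVERLAY_CAPTURE_LIST;
pub const OVERLAY_ALL: RENDERDOC_OverlayBits = u32::MAX; // ~0U in the C header
pub const OVERLAY_NONE: RENDERDOC_OverlayBits = 0;

fn main() {
    assert_eq!(OVERLAY_DEFAULT, 15);
    assert_eq!(OVERLAY_ALL, 4294967295);
}
```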