
Type inference failure involving binary operators, traits, references, and integer defaulting #36549

Closed
@solson

Description


We expect this to compile without error (reasoning below):

let c: &u32 = &5; ((c >> 8) & 0xff) as u8

Compile error:

error: the trait bound `u32: std::ops::BitAnd<i32>` is not satisfied [--explain E0277]
  --> <anon>:10:27
   |>
10 |>         let c: &u32 = &5; ((c >> 8) & 0xff) as u8
   |>                           ^^^^^^^^^^^^^^^^^

Problem: the literal 0xff is being inferred as i32 instead of u32, which would satisfy the bound.

Explanation of my understanding of the code:

  1. c has type &u32
  2. 8 is defaulted to i32 (perhaps 0xff is also unhelpfully defaulted to i32 at this stage?)
  3. c >> 8 has type u32 via impl<'a> Shr<i32> for &'a u32 (to the best of my knowledge)
  4. Next we expect 0xff to infer to u32 to use the u32: BitAnd<u32> impl, but it fails.
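For concreteness, the expected chain above can be written out with every intermediate type pinned explicitly. This is a minimal sketch (not taken from the issue itself); it compiles, which suggests the failure lies only in inference choosing i32 for the mask literal:

```rust
fn main() {
    let c: &u32 = &5;
    // Step 3: `&u32 >> i32` goes through the forwarding reference impl
    // and yields a plain `u32`.
    let shifted: u32 = c >> 8i32;
    // Step 4: once the mask literal is pinned to `u32`,
    // the `u32: BitAnd<u32>` impl applies with no ambiguity.
    let masked: u32 = shifted & 0xff_u32;
    let byte = masked as u8;
    println!("{}", byte); // 5 >> 8 == 0, so the low byte is 0
}
```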

Working examples (each with a slight twist):

// The most obvious fix is to force the type of 0xff:
let c: &u32 = &5; ((c >> 8) & 0xffu32) as u8

// But that shouldn't be necessary! It works without that if `c: u32` or `*c` is used:
let c: u32 = 5; ((c >> 8) & 0xff) as u8
let c: &u32 = &5; ((*c >> 8) & 0xff) as u8

// Finally, and most oddly, using an identity cast or type ascription
// from `u32` to `u32` also convinces the inference engine:
let c: &u32 = &5; ((c >> 8) as u32 & 0xff) as u8
let c: &u32 = &5; ((c >> 8): u32 & 0xff) as u8
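The working variants above can be collected into one runnable snippet. The value 0x1234 is chosen here (not from the issue) just so the shift and mask are visible in the result; the type-ascription variant is omitted since ascription is not available on stable Rust:

```rust
fn main() {
    let c: &u32 = &0x1234;

    // Force the mask literal's type directly:
    let a = ((c >> 8) & 0xff_u32) as u8;

    // Dereference first, so inference starts from `u32` rather than `&u32`:
    let b = ((*c >> 8) & 0xff) as u8;

    // Identity cast on the shifted value:
    let d = ((c >> 8) as u32 & 0xff) as u8;

    // All three take the second-lowest byte of 0x1234.
    assert_eq!((a, b, d), (0x12, 0x12, 0x12));
}
```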

Who thought identity casts were useless?

cc @retep998 @nagisa

Metadata


Labels

A-inference (Area: Type inference), C-bug (Category: This is a bug.)
