We expect this to compile without error (reasoning below):

```rust
let c: &u32 = &5; ((c >> 8) & 0xff) as u8
```
Compile error:

```
error: the trait bound `u32: std::ops::BitAnd<i32>` is not satisfied [--explain E0277]
  --> <anon>:10:27
   |>
10 |>         let c: &u32 = &5; ((c >> 8) & 0xff) as u8
   |>                           ^^^^^^^^^^^^^^^^^
```
Problem: The `0xff` is getting inferred to `i32` instead of `u32`, which would work.
Explanation of my understanding of the code:

- `c` has type `&u32`
- `8` is defaulted to `i32` (perhaps `0xff` is also unhelpfully defaulted to `i32` at this stage?)
- `c >> 8` has type `u32` via `impl<'a> Shr<i32> for &'a u32` (to the best of my knowledge)
- Next we expect `0xff` to infer to `u32` to use the `u32: BitAnd<u32>` impl, but it fails (see the sketch below).
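To make that last step concrete, here is a small sketch (not taken from the report itself) that spells out the intended impl with fully-qualified syntax. As far as I can tell this compiles, because naming `<u32 as BitAnd<u32>>::bitand` explicitly leaves the `0xff` literal nothing to guess about:

```rust
use std::ops::BitAnd;

fn main() {
    let c: &u32 = &5;
    // Naming the `u32: BitAnd<u32>` impl directly pins the right-hand
    // literal to u32, so the ambiguity never arises.
    let byte = <u32 as BitAnd<u32>>::bitand(c >> 8, 0xff) as u8;
    println!("{:#04x}", byte);
}
```

The working examples below achieve the same effect without the verbosity.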
Working examples (each with a slight twist):

```rust
// The most obvious fix is to force the type of 0xff:
let c: &u32 = &5; ((c >> 8) & 0xffu32) as u8

// But that shouldn't be necessary! It works without that if `c: u32` or `*c` is used:
let c: u32 = 5; ((c >> 8) & 0xff) as u8
let c: &u32 = &5; ((*c >> 8) & 0xff) as u8

// Finally, and most oddly, using an identity cast or type ascription
// from `u32` to `u32` also convinces the inference engine:
let c: &u32 = &5; ((c >> 8) as u32 & 0xff) as u8
let c: &u32 = &5; ((c >> 8): u32 & 0xff) as u8
```
Who thought identity casts were useless?
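As an aside, the `*c` workaround also packages nicely into a helper, so the dereference happens once and everything downstream operates on a plain `u32`. This is a hypothetical helper sketched for illustration; the name `byte_at` is made up and not part of the report:

```rust
// Hypothetical helper: dereference once, then shift/mask on a plain u32,
// which sidesteps the `&u32` literal-inference problem entirely.
fn byte_at(c: &u32, n: u32) -> u8 {
    let c = *c; // plain u32 from here on
    ((c >> (n * 8)) & 0xff) as u8
}

fn main() {
    assert_eq!(byte_at(&0xdead_beef, 0), 0xef);
    assert_eq!(byte_at(&0xdead_beef, 1), 0xbe);
}
```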