Description
PR #44 handles the `index` attribute on C-like enums, but does not address regular enums with some variants marked `#[codec(index = $int)]`.
All enum variants have a discriminant. C-style enums allow explicitly setting the discriminant. Regular enums do not.
The `#[codec(index = 123)]` attribute sets the index byte for the variant it decorates when the enum is SCALE encoded. "Index" and "discriminant" do not mean the same thing, but the current implementation conflates them somewhat: the default index for a variant is, in order of priority, the `#[codec(index = …)]` attribute, then the explicitly set discriminant, and lastly the implicit index from the order of appearance.
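
To make the distinction concrete, here is a hedged sketch assuming `parity-scale-codec`'s derive (with its `derive` feature enabled) and the priority rule just described; the exact bytes depend on the codec version:

```rust
use parity_scale_codec::Encode;

// C-style enum: the values are language-level discriminants, which the
// derive falls back to as index bytes when no attribute is given.
#[derive(Encode)]
enum CStyle {
    A = 10,
    B = 20,
}

// Regular enum: #[codec(index = ...)] only steers the SCALE index byte;
// these variants have no Rust discriminant at all.
#[derive(Encode)]
enum Regular {
    #[codec(index = 10)]
    A(u8),
    #[codec(index = 20)]
    B(u8),
}

fn main() {
    println!("{:?}", CStyle::B.encode());     // expected: [20]
    println!("{:?}", Regular::B(7).encode()); // expected: [20, 7]
}
```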
I believe `parity-scale-codec` is slightly broken today, in that the following will lead to bad things happening when encoding/decoding:
```rust
#[derive(Encode)]
enum Foo {
    A,
    #[codec(index = 2)]
    B = 8,
    C,
}
```
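
Presumably the problem, under the priority rule above, is that `A` resolves to index 0 from its position, `B` to index 2 from its attribute (overriding its discriminant 8), and `C`, having neither an attribute nor an explicit discriminant, falls back to its positional index 2, colliding with `B`. A self-contained sketch of where that would surface (expected outcome under that rule, not verified output from a specific version):

```rust
use parity_scale_codec::Encode;

#[derive(Encode)]
enum Foo {
    A,
    #[codec(index = 2)]
    B = 8,
    C,
}

fn main() {
    // If B takes index 2 from its attribute and C takes index 2 from its
    // position, both encode to the same byte and decoding becomes ambiguous.
    println!("B: {:?}", Foo::B.encode()); // presumably [2]
    println!("C: {:?}", Foo::C.encode()); // presumably [2] as well
}
```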
The `Variant` struct in `scale-info` has a `discriminant` member, an `Option<u64>`. Today it is `None` for all non-C-style enums.
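
For reference, the relevant shape is roughly the following (a simplified, non-generic sketch; the real type is generic over a `Form` and has more members, see the `scale-info` source for the authoritative definition):

```rust
// Simplified sketch of scale-info's `Variant`; only the member relevant
// to this issue is spelled out.
pub struct Variant {
    pub name: &'static str,
    // `None` today for every variant of a non-C-style enum.
    pub discriminant: Option<u64>,
    // ...other members (e.g. the variant's fields) omitted.
}
```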
Picking the right solution here can be a bit tricky. We can use the current `discriminant` field to store the index, but that is a questionable solution, as it perpetuates the mixing of index and discriminant. Open to alternative ideas.
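
To illustrate why reusing the field is questionable, here is a tiny hypothetical stand-in (not `scale-info`'s real `Variant`) showing the ambiguity a metadata consumer would face:

```rust
// Hypothetical stand-in type: once `discriminant` doubles as the SCALE
// index, the stored value no longer says which of the two it is.
struct VariantSketch {
    name: &'static str,
    discriminant: Option<u64>,
}

fn main() {
    // For `Foo::B` above, `Some(2)` could mean "codec index 2" (from the
    // attribute) or "Rust discriminant 2"; the metadata alone cannot say.
    let b = VariantSketch { name: "B", discriminant: Some(2) };
    println!("{}: {:?}", b.name, b.discriminant);
}
```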