Description
Currently, large sets of integer literals are grouped under a single Kind. For example, all signed integers are `K"Integer"`, because the tokenizer rules cannot distinguish between different integers without numeric conversion. However, this seems semantically awkward when the actual value of a `K"Integer"` can be any of the types `Int32`, `Int64`, `Int128`, or `BigInt`.
We already need to perform numeric conversion to parse and validate floating point numbers; perhaps we should also do so for other numeric literals and set their kinds appropriately?
However, there's an awkward product here between the converted numeric type and the textual representation. For example, a hex integer will be converted to any of the types `UInt8`, `UInt16`, `UInt32`, `UInt64`, `UInt128`, or `BigInt`, and the same is true of binary and octal text representations. So we have all possible combinations from the product of
- Textual representations: `HexInt`, `BinInt`, `OctInt`
- Value representations: `UInt8`, `UInt16`, `UInt32`, `UInt64`, `UInt128`, `BigInt`
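As a concrete illustration of that product, Julia's existing literal rules already map a single textual form (hex) to different unsigned types based on digit count, so the textual kind alone does not determine the value type:

```julia
julia> typeof(0xff)                 # 2 hex digits  -> UInt8
UInt8

julia> typeof(0xffff)               # 4 hex digits  -> UInt16
UInt16

julia> typeof(0xffffffffffffffff)   # 16 hex digits -> UInt64
UInt64

julia> typeof(0b1010)               # binary literals follow the same digit-count rule
UInt8
```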
At the level of the green tree, the textual representation is probably what's most important. Later, during compilation, the exact Julia type becomes more important.
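One way to keep both pieces of information might be to leave the textual representation as the kind in the green tree and carry the converted value (and hence its exact Julia type) alongside it. A minimal sketch, with hypothetical names not taken from JuliaSyntax:

```julia
# Hypothetical sketch (names are illustrative, not JuliaSyntax API):
# the green tree records the textual form of the literal as its kind,
# while the converted value, and therefore its exact Julia type, is
# carried separately for use later in compilation.
struct IntegerLiteralInfo
    text_kind::Symbol                          # :DecInt, :HexInt, :BinInt, :OctInt
    value::Union{Int64, Int128, BigInt,
                 UInt8, UInt16, UInt32, UInt64, UInt128}
end

# The same :HexInt textual kind can carry different value types:
lit8  = IntegerLiteralInfo(:HexInt, 0xff)      # value::UInt8
lit16 = IntegerLiteralInfo(:HexInt, 0xffff)    # value::UInt16
```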