Description
This relates a bit to #90
The specification uses `number` pretty much everywhere. For example:
```typescript
interface ProtocolMessage {
  /**
   * Sequence number of the message (also known as message ID). The `seq` for
   * the first message sent by a client or debug adapter is 1, and for each
   * subsequent message is 1 greater than the previous message sent by that
   * actor. `seq` can be used to order requests, responses, and events, and to
   * associate requests with their corresponding responses. For protocol
   * messages of type `request` the sequence number can be used to cancel the
   * request.
   */
  seq: number;

  /**
   * Message type.
   * Values: 'request', 'response', 'event', etc.
   */
  type: 'request' | 'response' | 'event' | string;
}
```
But it doesn't define `number`. In some languages `number` refers to either a float or an integer.
The JSON schema uses `integer` in the same place, but also doesn't define a minimum or maximum. (As far as I could find, JSON Schema doesn't define a limit for `integer` either; a limit would have to be set explicitly via `minimum`/`maximum`.)
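For illustration only (an assumption about how the schema could be tightened, not what it currently contains), JSON Schema can express such a bound with the `minimum`/`maximum` keywords, e.g. for a `processId` property:

```json
{
  "processId": {
    "type": "integer",
    "minimum": 0,
    "maximum": 2147483647
  }
}
```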
There are a few places where the length is explicitly pointed out:
```typescript
interface RunInTerminalResponse extends Response {
  body: {
    /**
     * The process ID. The value should be less than or equal to 2147483647
     * (2^31-1).
     */
    processId?: number;

    /**
     * The process ID of the terminal shell. The value should be less than or
     * equal to 2147483647 (2^31-1).
     */
    shellProcessId?: number;
  };
}
```
`number` without any upper bound is problematic because languages/libraries can choose to serialize numbers differently:
> Numbers in JSON are agnostic with regard to their representation within programming languages. While this allows for numbers of arbitrary precision to be serialized, it may lead to portability issues. For example, since no differentiation is made between integer and floating-point values, some implementations may treat 42, 42.0, and 4.2E+1 as the same number, while others may not. The JSON standard makes no requirements regarding implementation details such as overflow, underflow, loss of precision, rounding, or signed zeros, but it does recommend expecting no more than IEEE 754 binary64 precision for "good interoperability".
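That equivalence is easy to observe in any language whose JSON parser maps every number to an IEEE 754 binary64 value; a small TypeScript sketch:

```typescript
// All three spellings decode to the same binary64 value, so a
// serializer is free to re-emit whichever form it likes.
const a = JSON.parse("42");
const b = JSON.parse("42.0");
const c = JSON.parse("4.2E+1");
console.log(a === b && b === c); // true
```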
Concrete example:
The `cjson` library in Neovim converts largish numbers to scientific notation, leading to precision loss:

```lua
print(vim.json.encode({big_int = 3053700806959403}))
-- {"big_int":3.0537008069594e+15}
```
Decoding that output in another language (here Haskell, where `Foo` is a record with an integral `threadId` field) shows the rounded value:

```haskell
print $ (decode "{\"threadId\": 3053700806959403}" :: Maybe Foo)
-- Just (Foo {threadId = 3053700806959403})
print $ (decode "{\"threadId\": 3.0537008069594e+15}" :: Maybe Foo)
-- Just (Foo {threadId = 3053700806959400})
This behavior is within the JSON specification but currently breaks the Dart debug adapter.
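For comparison (a sketch, not part of the bug report): JavaScript/TypeScript keeps this particular value exact, since it is below `Number.MAX_SAFE_INTEGER` (2^53 - 1), while the scientific-notation form emitted by `cjson` already carries the rounded value:

```typescript
const original = 3053700806959403;

// Below 2^53 - 1, so JSON.stringify emits all digits and a round trip is exact.
console.log(Number.isSafeInteger(original)); // true
const roundTripped = JSON.parse(JSON.stringify({ threadId: original })).threadId;
console.log(roundTripped === original); // true

// The scientific-notation form has already lost the trailing digits.
console.log(JSON.parse("3.0537008069594e+15")); // 3053700806959400
```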
Could the specification clarify the allowed range of numbers, and also clarify to what precision implementations must serialize them without falling back to scientific notation?
Maybe similar to the LSP spec:
```typescript
/**
 * Defines an integer number in the range of -2^31 to 2^31 - 1.
 */
export type integer = number;
```
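Since a `number`-backed alias like this cannot enforce the bound on arbitrary runtime input, a consumer might pair it with a runtime guard; a hypothetical sketch (the name `isInt32` is mine, not from any spec):

```typescript
// Hypothetical guard: true only for whole numbers within the int32 range.
function isInt32(n: number): boolean {
  return Number.isInteger(n) && n >= -(2 ** 31) && n <= 2 ** 31 - 1;
}

console.log(isInt32(2147483647)); // true
console.log(isInt32(2147483648)); // false
console.log(isInt32(42.5));       // false
```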