
bug: parsing big integers in JSON data loses precision #489

Closed
@lance

Description

Describe the Bug
When parsing JSON data, if a JSON field value is a number larger than Number.MAX_SAFE_INTEGER (2^53 − 1), JavaScript silently loses precision. For example, the Twitter API exposes the Tweet ID, a large integer that exceeds the safe integer range of Number. You can see this by simply assigning a large number literal to a variable (it's not actually JSON parsing that is the problem, it's JavaScript's Number type itself).

> let id = 1524831183200260097
undefined
> id
1524831183200260000
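For comparison, a BigInt literal holds the full value. A minimal sketch (variable names are illustrative):

```javascript
// Number silently rounds integers beyond Number.MAX_SAFE_INTEGER (2**53 - 1)
const asNumber = 1524831183200260097;   // rounds to 1524831183200260000
const asBigInt = 1524831183200260097n;  // BigInt literal keeps every digit

console.log(Number.isSafeInteger(asNumber)); // false
console.log(asBigInt.toString());            // "1524831183200260097"
```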

Steps to Reproduce

❯ npm install cloudevents
❯ cat > index.js
const { CloudEvent } = require('cloudevents')

let e = new CloudEvent({ source: 'example', type: 'example', data: 1524831183200260097 })
console.log(e.data)
^D
❯ node index.js
1524831183200260000

Expected Behavior
I expect the data to not lose precision.

Additional context
This can be resolved by using json-bigint, which can parse oversized integer values as native BigInt:

> let JSON = require('json-bigint')({useNativeBigInt: true})
> let j = JSON.parse('{ "key": 993143214321423154315154321 }')
> j.key
993143214321423154315154321
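Without pulling in a dependency, a similar effect can be sketched by quoting oversized integer literals in the raw text before JSON.parse sees them, then converting the quoted strings to BigInt in the reviver. parseWithBigInt is a hypothetical helper, and the regex is a deliberate simplification (it only handles object values and would need hardening against numbers inside string values):

```javascript
// Quote bare integer literals of 16+ digits so JSON.parse receives them as
// strings (tagged with a trailing "n"), then convert those tags to BigInt.
function parseWithBigInt(text) {
  const quoted = text.replace(/:(\s*)(-?\d{16,})(?=\s*[,}\]])/g, ':$1"$2n"');
  return JSON.parse(quoted, (key, value) =>
    typeof value === 'string' && /^-?\d+n$/.test(value)
      ? BigInt(value.slice(0, -1)) // strip the "n" tag, keep full precision
      : value
  );
}

const j = parseWithBigInt('{ "key": 993143214321423154315154321 }');
console.log(j.key);        // 993143214321423154315154321n
console.log(typeof j.key); // "bigint"
```

Small numbers fall below the 16-digit threshold and are left as ordinary Numbers, so existing callers are unaffected.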
