Description
I would like to reopen the discussion because I strongly disagree with the current implementation.
Relevant issues: #107, #266, #271, #296, #301
By default, I think any PG type that maps directly to a JavaScript type should be converted to that type. If there isn't a direct mapping, then simply returning a string is fine. Of course, the user can override the type parser if they have different needs. This is how node-mysql handles type parsing, which I believe to be the correct approach.
What is the problem? Right now all decimal numbers are returned as strings, while 64-bit integers are parsed as JavaScript numbers. Some of the former have a direct mapping to a JavaScript type, and the latter can silently lose precision.
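The precision loss is easy to demonstrate in plain JavaScript (this is an illustration, not driver code): JavaScript numbers are IEEE 754 doubles, which represent integers exactly only up to 2^53, while a `bigint` column goes far beyond that.

```javascript
// JavaScript numbers are IEEE 754 doubles: integers are exact only up to 2^53.
// A bigint column can hold values far beyond that, so parseInt() silently
// rounds them to the nearest representable double.
const raw = '9223372036854775807';   // max bigint, as Postgres sends it
const parsed = parseInt(raw, 10);
console.log(String(parsed) === raw); // false -- the value no longer round-trips

// The loss starts right past 2^53:
console.log(parseInt('9007199254740993', 10)); // 9007199254740992
```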
The Postgres docs specify that `real` and `double precision` are "implementations of IEEE Standard 754 for Binary Floating-Point Arithmetic (single and double precision, respectively)". JavaScript has only one number type: IEEE 754 double precision. Every JavaScript number, whether integer or decimal, is stored as an IEEE 754 double.
This means that `parseFloat()` can safely be used for `real` and `double precision`. Returning strings is confusing: it is not what the user expects, and it forces them to do extra work overriding the type parser without any real benefit.
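A quick round-trip check illustrates why this is safe (assuming the server sends enough digits to round-trip, which modern Postgres does by default for float output):

```javascript
// Any float8 value round-trips through its decimal text form, because a
// JavaScript number *is* an IEEE 754 double.
const samples = [3.14, -0.1, 1e308, 5e-324, 123456.789];
for (const x of samples) {
  // String(x) stands in for the text a driver would receive;
  // parseFloat recovers the original value exactly.
  console.log(parseFloat(String(x)) === x); // true for every sample
}
```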
JavaScript's number representation does cause a problem for `numeric`/`decimal`, `bigint`, and `bigserial`, since converting values of those types can lose precision. This is the case where a string is appropriate. Every other PG number type can be represented as a JavaScript number without loss of precision.
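The same experiment shows why `numeric` must stay a string: its values need not fit a double at all. A minimal illustration:

```javascript
// numeric/decimal can carry more precision than a double can hold,
// so parsing drops digits silently.
const precise = '1.00000000000000001'; // 18 significant digits
console.log(parseFloat(precise));      // 1 -- the fractional digits vanish

// Keeping the string preserves the exact value for the caller to handle
// (e.g. with a decimal library).
const big = '12345678901234567890.12';
console.log(String(parseFloat(big)) === big); // false
```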
Below is what I think should be the correct default parsing:
- `smallint`: `parseInt()`
- `integer`: `parseInt()`
- `bigint`: string
- `decimal`: string
- `numeric`: string
- `real`: `parseFloat()`
- `double precision`: `parseFloat()`
- `smallserial`: `parseInt()`
- `serial`: `parseInt()`
- `bigserial`: string