BUG: Conversion fails for columns with datetime64[ms] #57738

Open
alvarofagner opened this issue Mar 5, 2024 · 3 comments
Labels
Bug IO JSON read_json, to_json, json_normalize Non-Nano datetime64/timedelta64 with non-nanosecond resolution

Comments

@alvarofagner

Pandas version checks

  • I have checked that this issue has not already been reported.

  • I have confirmed this bug exists on the latest version of pandas.

  • I have confirmed this bug exists on the main branch of pandas.

Reproducible Example

import pandas as pd

if __name__ == "__main__":
    df = pd.DataFrame(
        [
            ["2023-09-29 02:55:54"],
            ["2023-09-29 02:56:03"],
        ],
        columns=["timestamp"],
        dtype="datetime64[ms]",
    )

    serialized = df.to_json()
    print(serialized)
    # Got: {"timestamp":{"0":1695956,"1":1695956}}
    # Should be: {"timestamp":{"0":1695956154000,"1":1695956163000}}
    deserialized = pd.read_json(serialized, convert_dates=["timestamp"])
    print(pd.to_datetime(deserialized["timestamp"], unit="ms"))
    # Got:
    # 0   1970-01-01 00:28:15.956
    # 1   1970-01-01 00:28:15.956
    # Instead of:
    # 0   2023-09-29 02:55:54
    # 1   2023-09-29 02:56:03

Issue Description

When a DataFrame has a column of dtype datetime64[ms] and it is serialized with df.to_json(), the emitted values do not correspond to the actual timestamps, so converting the JSON back yields wrong dates. In the example above, the output looks as though the underlying millisecond integers were treated as nanoseconds and scaled down to milliseconds again (1695956154000 becomes 1695956).
See the reproducible example.

Expected Behavior

The JSON values for the timestamp column should correspond to the given date strings, so that the correct date/time can be restored.
It works when using datetime64[ns], as in the sketch below.
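
For comparison, a minimal sketch (not part of the original report; assumes pandas 2.2 as in the environment below) of the same round-trip with nanosecond resolution:

import pandas as pd

# The same frame with datetime64[ns] serializes to the expected millisecond
# epoch values, since to_json's default date_unit="ms" conversion assumes
# nanosecond-backed values.
df_ns = pd.DataFrame(
    [
        ["2023-09-29 02:55:54"],
        ["2023-09-29 02:56:03"],
    ],
    columns=["timestamp"],
    dtype="datetime64[ns]",
)
print(df_ns.to_json())
# {"timestamp":{"0":1695956154000,"1":1695956163000}}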

Installed Versions

INSTALLED VERSIONS

commit : bdc79c1
python : 3.12.1.final.0
python-bits : 64
OS : Linux
OS-release : 5.4.0-150-generic
Version : #167~18.04.1-Ubuntu SMP Wed May 24 00:51:42 UTC 2023
machine : x86_64
processor :
byteorder : little
LC_ALL : None
LANG : en_US.UTF-8
LOCALE : en_US.UTF-8

pandas : 2.2.1
numpy : 1.26.4
pytz : 2024.1
dateutil : 2.8.2
setuptools : 69.1.1
pip : 23.3.1
Cython : None
pytest : 8.0.2
hypothesis : None
sphinx : None
blosc : None
feather : None
xlsxwriter : None
lxml.etree : None
html5lib : None
pymysql : None
psycopg2 : 2.9.9
jinja2 : 3.1.3
IPython : None
pandas_datareader : None
adbc-driver-postgresql: None
adbc-driver-sqlite : None
bs4 : None
bottleneck : None
dataframe-api-compat : None
fastparquet : None
fsspec : None
gcsfs : None
matplotlib : None
numba : None
numexpr : None
odfpy : None
openpyxl : 3.1.2
pandas_gbq : None
pyarrow : 15.0.0
pyreadstat : None
python-calamine : None
pyxlsb : None
s3fs : None
scipy : None
sqlalchemy : 2.0.27
tables : None
tabulate : None
xarray : None
xlrd : None
zstandard : 0.22.0
tzdata : 2024.1
qtpy : None
pyqt5 : None
None

@alvarofagner alvarofagner added Bug Needs Triage Issue that has not been reviewed by a pandas team member labels Mar 5, 2024
@jbrockmendel jbrockmendel added the IO JSON read_json, to_json, json_normalize label Mar 5, 2024
@jbrockmendel
Member

jbrockmendel commented Mar 5, 2024

xref #55827

@Ricardus312

I encountered the same problem with the to_datetime64() function when upgrading to version 2.2.

According to the documentation, until version 2.0.3 the pandas.Timestamp.to_datetime64 function returned "a numpy.datetime64 object with 'ns' precision". Since version 2.1.4 that function returns "a numpy.datetime64 object with same precision".

This change has broken critical parts of my code that assumed conversion with nanosecond precision; now the resolution of the result depends on the precision of the argument passed, which makes the behavior hard to predict.

I recommend downgrading to an older version of pandas as a workaround.
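
For illustration, a minimal sketch (my reading of the behavior on pandas 2.2, not part of the original comment): the unit of the returned numpy.datetime64 follows the precision inferred from the constructor argument, and Timestamp.as_unit("ns") pins the result to nanoseconds explicitly.

import pandas as pd

# Assumed behavior on pandas 2.2: the Timestamp unit is inferred from the
# input string, and to_datetime64() preserves it instead of always using 'ns'.
print(pd.Timestamp("2024-01-01 12:00:00").to_datetime64().dtype)             # datetime64[s]
print(pd.Timestamp("2024-01-01 12:00:00.123").to_datetime64().dtype)         # datetime64[ms]
print(pd.Timestamp("2024-01-01 12:00:00.123456789").to_datetime64().dtype)   # datetime64[ns]

# Casting to a fixed unit first makes the result independent of the input precision.
print(pd.Timestamp("2024-01-01 12:00:00").as_unit("ns").to_datetime64().dtype)  # datetime64[ns]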

@lithomas1
Member

If you specify date_format="iso" in the to_json call, the round-tripping happens successfully.

(Note: the dtype will still be datetime64[ns] for values that are in range for datetime64[ns]. I don't know whether we should be reading values that are in-bounds for datetime64[ns] as non-nano.)

It looks like we can do better at preserving the dtype for non-nano datetimes, though.

Running

import io

import pandas as pd

if __name__ == "__main__":
    df = pd.DataFrame(
        [
            ["1000-09-29 02:55:54"],
            ["1000-09-29 02:56:03"],
        ],
        columns=["timestamp"],
        dtype="datetime64[ms]",
    )
    serialized = df.to_json(date_format="iso")
    print(serialized)
    deserialized = pd.read_json(io.StringIO(serialized), convert_dates=["timestamp"])
    print(deserialized)
    print(deserialized.dtypes)

I get object dtype for the deserialized JSON's timestamp column.

Output

{"timestamp":{"0":"1000-09-29T02:55:54.000","1":"1000-09-29T02:56:03.000"}}
                 timestamp
0  1000-09-29T02:55:54.000
1  1000-09-29T02:56:03.000
timestamp    object
dtype: object
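
As a follow-up sketch (my assumption, not from the thread): for dates that are in range for datetime64[ns], the original millisecond resolution can be restored explicitly after the ISO round-trip.

import io

import pandas as pd

# Hypothetical workaround: serialize as ISO strings, then cast the parsed
# nanosecond column back to the original millisecond resolution.
df = pd.DataFrame(
    [["2023-09-29 02:55:54"], ["2023-09-29 02:56:03"]],
    columns=["timestamp"],
    dtype="datetime64[ms]",
)
serialized = df.to_json(date_format="iso")
restored = pd.read_json(io.StringIO(serialized), convert_dates=["timestamp"])
# convert_dates yields datetime64[ns] here because the values are in range;
# cast back to milliseconds to recover the original dtype.
restored["timestamp"] = restored["timestamp"].astype("datetime64[ms]")
print(restored.dtypes)
# timestamp    datetime64[ms]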

@lithomas1 lithomas1 added Non-Nano datetime64/timedelta64 with non-nanosecond resolution and removed Needs Triage Issue that has not been reviewed by a pandas team member labels Mar 22, 2024