
cftime autochunking does not work with kerchunk reference datasets (& presumably other virtualised data) #10989

@charles-turner-1

Description


What happened?

The cftime autochunking we implemented in #10527 doesn't appear to work with kerchunk reference datasets; see https://github.com/intake/intake-esm/actions/runs/19956553123/job/57226785198?pr=737:

        if dtype.hasobject:
>           raise NotImplementedError(
                "Can not use auto rechunking with object dtype. "
                "We are unable to estimate the size in bytes of object data"
            )
E           NotImplementedError: Can not use auto rechunking with object dtype. We are unable to estimate the size in bytes of object data
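For context, the dask check that raises here is keyed off `dtype.hasobject`. A numpy-only illustration (not dask's code) of why object dtype defeats byte-size estimation:

```python
import numpy as np

# For fixed-width dtypes, itemsize is the true per-element byte count,
# which is what "auto" chunking needs to pick a chunk shape.
fixed = np.zeros(1000, dtype=np.int64)
print(fixed.dtype.itemsize, fixed.dtype.hasobject)  # 8 False

# Object arrays only store pointers; dtype.itemsize says nothing about
# the size of the boxed payloads, so dask refuses to auto-rechunk them.
boxed = np.zeros(1000, dtype=object)
print(boxed.dtype.hasobject)  # True
```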

I've only just started prodding into this, but it appears to stem from this:

>>> data
CopyOnWriteArray(array=_ElementwiseFunctionArray(LazilyIndexedArray(array=<xarray.backends.zarr.ZarrArrayWrapper object at 0x1663f2a80>, key=BasicIndexer((slice(None, None, None),))), func=functools.partial(<function _apply_mask at 0x154e240e0>, encoded_fill_values={np.bytes_(b' ')}, decoded_fill_value=nan, dtype=dtype('O')), dtype=dtype('O')))
>>> _contains_cftime_datetimes(data)
False
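My rough mental model of why this returns False (a hypothetical sketch, not xarray's actual `_contains_cftime_datetimes` implementation): detection samples an element of the object array, and here the sampled element is a masked byte string / NaN from the `_apply_mask` wrapper rather than a `cftime.datetime`, so the check falls through:

```python
import numpy as np

# Hypothetical sketch of sample-based detection: decide "is this cftime
# data?" by inspecting one element of an object-dtype array.
def contains_cftime_like(array: np.ndarray) -> bool:
    if array.dtype != np.dtype("O") or array.size == 0:
        return False
    sample = array.ravel()[0]
    # Only true cftime objects live in the cftime module.
    return type(sample).__module__.split(".")[0] == "cftime"

# The variable in the repr above decodes b' ' fill values to NaN, so a
# sampled element is a float/bytes value, not a cftime datetime:
masked = np.array([np.nan, b"abc"], dtype=object)
print(contains_cftime_like(masked))  # False -> dtype stays object
```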

which then bypasses the dtype faking logic we implemented:

limit: int | None
if _contains_cftime_datetimes(data):
    limit, dtype = fake_target_chunksize(data, chunkmanager.get_auto_chunk_size())
else:
    limit = None
    dtype = data.dtype
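For readers unfamiliar with #10527, the dtype-faking idea can be sketched roughly as follows (illustrative only, not the actual `fake_target_chunksize` implementation; the function name suffix, signature, and the 64-byte per-object estimate are placeholders of mine):

```python
import numpy as np

# Placeholder estimate of a boxed cftime object's in-memory footprint.
CFTIME_OBJECT_NBYTES = 64

def fake_target_chunksize_sketch(limit_bytes: int) -> tuple[int, np.dtype]:
    # Pretend each element is a fixed-width 8-byte value so dask can
    # estimate sizes, and shrink the byte limit to compensate for the
    # assumed larger real footprint of the boxed objects.
    proxy = np.dtype(np.int64)
    scaled = int(limit_bytes * proxy.itemsize / CFTIME_OBJECT_NBYTES)
    return scaled, proxy
```

With the scaled-down limit and the proxy dtype, dask never sees `dtype('O')` and can size chunks normally; the bug here is that this path is skipped entirely when `_contains_cftime_datetimes` returns False.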

I haven't tested with other virtualisation formats/methods, but I assume it will fail similarly for e.g. a virtual icechunk store.

Gonna dig into this myself - just flagging it so others are aware


I'll fill this out as my investigation progresses.

What did you expect to happen?

No response

Minimal Complete Verifiable Example

# /// script
# requires-python = ">=3.11"
# dependencies = [
#   "xarray[complete]@git+https://github.com/pydata/xarray.git@main",
#   "s3fs==0.4.2",
# ]
# ///
#
# This script automatically imports the development branch of xarray to check for issues.
# Please delete this header if you have _not_ tested this script with `uv run`!

import xarray as xr
xr.show_versions()
import s3fs

xarray_open_kwargs = {
    'engine': 'zarr',
    'chunks': 'auto',
    'use_cftime': True,
    'backend_kwargs': {
        'storage_options': {
            'remote_protocol': 's3',
            'remote_options': {'anon': True, 'asynchronous': True},
            'fo': 'https://storage.googleapis.com/xr-10989/noaa-nwm-test-reference.json',
        },
        'consolidated': False,
    },
}

xr.open_dataset("reference://", **xarray_open_kwargs)

Steps to reproduce

No response

MVCE confirmation

  • Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
  • Complete example — the example is self-contained, including all data and the text of any traceback.
  • Verifiable example — the example copy & pastes into an IPython prompt or Binder notebook, returning the result.
  • New issue — a search of GitHub Issues suggests this is not a duplicate.
  • Recent environment — the issue occurs with the latest version of xarray and its dependencies.

Relevant log output

Anything else we need to know?

No response

Environment
