Currently, `Dataset.to_zarr` will only write Zarr datasets in cases in which:
- The Dataset arrays are in memory (no dask)
- The arrays are chunked with dask with a one-to-many relationship between dask chunks and zarr chunks
If I try to violate the one-to-many condition, I get an error:

```python
import xarray as xr

ds = xr.DataArray([0, 1., 2], name='foo').chunk({'dim_0': 1}).to_dataset()
d = ds.to_zarr('test.zarr', encoding={'foo': {'chunks': (3,)}}, compute=False)
```
```
/srv/conda/envs/notebook/lib/python3.8/site-packages/xarray/backends/zarr.py in _determine_zarr_chunks(enc_chunks, var_chunks, ndim, name)
    148     for dchunk in dchunks[:-1]:
    149         if dchunk % zchunk:
--> 150             raise NotImplementedError(
    151                 f"Specified zarr chunks encoding['chunks']={enc_chunks_tuple!r} for "
    152                 f"variable named {name!r} would overlap multiple dask chunks {var_chunks!r}. "

NotImplementedError: Specified zarr chunks encoding['chunks']=(3,) for variable named 'foo' would overlap multiple dask chunks ((1, 1, 1),). This is not implemented in xarray yet. Consider either rechunking using `chunk()` or instead deleting or modifying `encoding['chunks']`.
```
In this case, the error is particularly frustrating because I'm not even writing any data yet. (Also related to #2300, #4046, #4380).
There are at least two scenarios in which we might want more flexibility:
1. The case above, where we want to lazily initialize a Zarr store from a Dataset without actually computing anything.
2. The more general case, where we actually write arrays with a many-to-many dask-chunk <-> zarr-chunk relationship.
For 1, I propose we add a new option like `safe_chunks=True` to `to_zarr`. Passing `safe_chunks=False` would simply bypass this check.
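As a minimal sketch of how such a flag might thread through the chunk check (the function name and signature here are assumptions modeled on the traceback above, not xarray's actual internals):

```python
def determine_zarr_chunks(enc_chunks, var_chunks, safe_chunks=True):
    """Hypothetical version of the divisibility check with a bypass flag."""
    for zchunk, dchunks in zip(enc_chunks, var_chunks):
        # Every dask chunk except the last must be an exact multiple of the
        # zarr chunk; otherwise one dask task would write into a zarr chunk
        # that a neighboring task also writes.
        for dchunk in dchunks[:-1]:
            if dchunk % zchunk:
                if safe_chunks:
                    raise NotImplementedError(
                        f"zarr chunks {enc_chunks!r} would overlap "
                        f"multiple dask chunks {var_chunks!r}"
                    )
                # safe_chunks=False: accept the layout anyway, e.g. when
                # only initializing the store (compute=False) or when the
                # caller guarantees writes will not race.
                return enc_chunks
    return enc_chunks
```

With `safe_chunks=False`, the `compute=False` call above could succeed, since no data is written at store-initialization time.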
For 2, we could consider implementing locks. This probably has to be done at the Dask level, but it is actually not super hard to deterministically figure out which chunks need to share a lock.
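To sketch the deterministic part: along a single dimension, we can compute which zarr chunks each dask chunk overlaps; any dask chunks whose sets intersect would need to share a lock (the helper name below is hypothetical, not an existing API):

```python
from itertools import accumulate

def zarr_chunks_touched(dask_chunks, zarr_chunk):
    """For each dask chunk along one dimension, return the set of zarr
    chunk indices its slice overlaps (hypothetical helper)."""
    # Start offset of each dask chunk along the dimension.
    starts = [0, *accumulate(dask_chunks)][:-1]
    touched = []
    for start, size in zip(starts, dask_chunks):
        first = start // zarr_chunk               # first zarr chunk overlapped
        last = (start + size - 1) // zarr_chunk   # last zarr chunk overlapped
        touched.append(set(range(first, last + 1)))
    return touched

# dask chunks (2, 3, 2) against zarr chunks of size 3:
# [0, 2) -> {0}; [2, 5) -> {0, 1}; [5, 7) -> {1, 2}
# so dask chunks 0 and 1 contend for zarr chunk 0, and 1 and 2 for zarr chunk 1.
```

Dask chunks with disjoint sets can write concurrently; only the overlapping pairs need a shared lock, so the locking can stay fairly fine-grained.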