forked from pydata/xarray

Commit: Delete pynio backend. (pydata#8971)
* Delete pynio backend.

* cleanup test

* fix whats-new
dcherian authored Apr 25, 2024
1 parent b003674 commit 8a23e24
Showing 10 changed files with 16 additions and 249 deletions.
1 change: 0 additions & 1 deletion .binder/environment.yml

@@ -28,7 +28,6 @@ dependencies:
   - pip
   - pooch
   - pydap
-  - pynio
   - rasterio
   - scipy
   - seaborn
3 changes: 0 additions & 3 deletions doc/getting-started-guide/installing.rst

@@ -31,9 +31,6 @@ For netCDF and IO
 - `pydap <https://www.pydap.org>`__: used as a fallback for accessing OPeNDAP
 - `h5netcdf <https://github.com/h5netcdf/h5netcdf>`__: an alternative library for
   reading and writing netCDF4 files that does not use the netCDF-C libraries
-- `PyNIO <https://www.pyngl.ucar.edu/Nio.shtml>`__: for reading GRIB and other
-  geoscience specific file formats. Note that PyNIO is not available for Windows and
-  that the PyNIO backend may be moved outside of xarray in the future.
 - `zarr <https://zarr.readthedocs.io>`__: for chunked, compressed, N-dimensional arrays.
 - `cftime <https://unidata.github.io/cftime>`__: recommended if you
   want to encode/decode datetimes for non-standard calendars or dates before
21 changes: 0 additions & 21 deletions doc/user-guide/io.rst

@@ -1294,27 +1294,6 @@ We recommend installing cfgrib via conda::

 .. _cfgrib: https://github.com/ecmwf/cfgrib

-.. _io.pynio:
-
-Formats supported by PyNIO
---------------------------
-
-.. warning::
-
-   The `PyNIO backend is deprecated`_. `PyNIO is no longer maintained`_.
-
-Xarray can also read GRIB, HDF4 and other file formats supported by PyNIO_,
-if PyNIO is installed. To use PyNIO to read such files, supply
-``engine='pynio'`` to :py:func:`open_dataset`.
-
-We recommend installing PyNIO via conda::
-
-    conda install -c conda-forge pynio
-
-.. _PyNIO: https://www.pyngl.ucar.edu/Nio.shtml
-.. _PyNIO backend is deprecated: https://github.com/pydata/xarray/issues/4491
-.. _PyNIO is no longer maintained: https://github.com/NCAR/pynio/issues/53
-

 CSV and other formats supported by pandas
 -----------------------------------------
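The deleted section told users to pass ``engine='pynio'`` to ``open_dataset``; for GRIB the docs now point at cfgrib instead (covered just above the removed section). The fallback idea can be sketched as a hypothetical helper, not part of xarray's API, that picks the first engine whose backing module is importable:

```python
from importlib.util import find_spec


def pick_engine(preferences):
    """Return the first engine whose backing module is importable.

    Hypothetical helper for illustration only; xarray selects engines
    through its backend-entrypoint machinery, not this function.
    ``preferences`` is a list of (module_name, engine_name) pairs.
    """
    for module_name, engine_name in preferences:
        if find_spec(module_name) is not None:
            return engine_name
    raise ImportError(
        "no suitable IO engine installed; tried: "
        + ", ".join(name for _, name in preferences)
    )


# Preference order once pynio is gone: try cfgrib; the stdlib 'json'
# module stands in as an always-importable placeholder fallback.
engine = pick_engine([("cfgrib", "cfgrib"), ("json", "netcdf4")])
```

With cfgrib installed this picks ``"cfgrib"``; otherwise it falls through to the placeholder entry rather than failing.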
5 changes: 3 additions & 2 deletions doc/whats-new.rst

@@ -32,6 +32,8 @@ New Features

 Breaking changes
 ~~~~~~~~~~~~~~~~
+- The PyNIO backend has been deleted (:issue:`4491`, :pull:`7301`).
+  By `Deepak Cherian <https://github.com/dcherian>`_.


 Bug fixes

@@ -6806,8 +6808,7 @@ Enhancements
   datasets with a MultiIndex to a netCDF file. User contributions in this
   area would be greatly appreciated.

-- Support for reading GRIB, HDF4 and other file formats via PyNIO_. See
-  :ref:`io.pynio` for more details.
+- Support for reading GRIB, HDF4 and other file formats via PyNIO_.
 - Better error message when a variable is supplied with the same name as
   one of its dimensions.
 - Plotting: more control on colormap parameters (:issue:`642`). ``vmin`` and
2 changes: 0 additions & 2 deletions xarray/backends/__init__.py

@@ -15,7 +15,6 @@
 from xarray.backends.netCDF4_ import NetCDF4BackendEntrypoint, NetCDF4DataStore
 from xarray.backends.plugins import list_engines, refresh_engines
 from xarray.backends.pydap_ import PydapBackendEntrypoint, PydapDataStore
-from xarray.backends.pynio_ import NioDataStore
 from xarray.backends.scipy_ import ScipyBackendEntrypoint, ScipyDataStore
 from xarray.backends.store import StoreBackendEntrypoint
 from xarray.backends.zarr import ZarrBackendEntrypoint, ZarrStore

@@ -30,7 +29,6 @@
     "InMemoryDataStore",
     "NetCDF4DataStore",
     "PydapDataStore",
-    "NioDataStore",
     "ScipyDataStore",
     "H5NetCDFStore",
     "ZarrStore",
19 changes: 9 additions & 10 deletions xarray/backends/api.py

@@ -61,7 +61,7 @@
 T_NetcdfEngine = Literal["netcdf4", "scipy", "h5netcdf"]
 T_Engine = Union[
     T_NetcdfEngine,
-    Literal["pydap", "pynio", "zarr"],
+    Literal["pydap", "zarr"],
     type[BackendEntrypoint],
     str,  # no nice typing support for custom backends
     None,

@@ -79,7 +79,6 @@
     "scipy": backends.ScipyDataStore,
     "pydap": backends.PydapDataStore.open,
     "h5netcdf": backends.H5NetCDFStore.open,
-    "pynio": backends.NioDataStore,
     "zarr": backends.ZarrStore.open_group,
 }

@@ -420,8 +419,8 @@ def open_dataset(
         ends with .gz, in which case the file is gunzipped and opened with
         scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
         objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
-    engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", \
-        "zarr", None}, installed backend \
+    engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "zarr", None}\
+        , installed backend \
         or subclass of xarray.backends.BackendEntrypoint, optional
         Engine to use when reading files. If not provided, the default engine
         is chosen based on available dependencies, with a preference for

@@ -523,7 +522,7 @@ def open_dataset(
         relevant when using dask or another form of parallelism. By default,
         appropriate locks are chosen to safely read and write files with the
         currently active dask scheduler. Supported by "netcdf4", "h5netcdf",
-        "scipy", "pynio".
+        "scipy".
     See engine open function for kwargs accepted by each specific engine.

@@ -627,8 +626,8 @@ def open_dataarray(
         ends with .gz, in which case the file is gunzipped and opened with
         scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
         objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
-    engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", \
-        "zarr", None}, installed backend \
+    engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "zarr", None}\
+        , installed backend \
         or subclass of xarray.backends.BackendEntrypoint, optional
         Engine to use when reading files. If not provided, the default engine
         is chosen based on available dependencies, with a preference for

@@ -728,7 +727,7 @@ def open_dataarray(
         relevant when using dask or another form of parallelism. By default,
         appropriate locks are chosen to safely read and write files with the
         currently active dask scheduler. Supported by "netcdf4", "h5netcdf",
-        "scipy", "pynio".
+        "scipy".
     See engine open function for kwargs accepted by each specific engine.

@@ -897,8 +896,8 @@ def open_mfdataset(
         If provided, call this function on each dataset prior to concatenation.
         You can find the file-name from which each dataset was loaded in
         ``ds.encoding["source"]``.
-    engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", \
-        "zarr", None}, installed backend \
+    engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "zarr", None}\
+        , installed backend \
         or subclass of xarray.backends.BackendEntrypoint, optional
         Engine to use when reading files. If not provided, the default engine
         is chosen based on available dependencies, with a preference for
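The second hunk above edits a plain dict mapping ``engine`` strings to backend open functions. A hypothetical miniature of that dispatch, with stand-in opener callables rather than xarray's real backends, shows the kind of failure a caller now hits for the removed ``"pynio"`` key:

```python
# Miniature of an engine -> opener dispatch table; the openers here are
# stand-ins for illustration, not xarray's actual backend classes.
ENGINES = {
    "netcdf4": lambda path: f"opened {path} with netcdf4",
    "scipy": lambda path: f"opened {path} with scipy",
    "h5netcdf": lambda path: f"opened {path} with h5netcdf",
    "zarr": lambda path: f"opened {path} with zarr",
    # "pynio" entry deleted by this commit
}


def open_dataset(path, engine):
    """Look up the opener for `engine` and apply it to `path`."""
    try:
        opener = ENGINES[engine]
    except KeyError:
        raise ValueError(
            f"unrecognized engine {engine!r}; must be one of {sorted(ENGINES)}"
        ) from None
    return opener(path)
```

In the real library an unknown engine is reported through the backend-plugin machinery, but the shape of the lookup is the same: a missing dict entry becomes a user-facing error listing the valid engines.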
164 changes: 0 additions & 164 deletions xarray/backends/pynio_.py

This file was deleted.

1 change: 0 additions & 1 deletion xarray/tests/__init__.py

@@ -99,7 +99,6 @@ def _importorskip(
 )

 has_h5netcdf, requires_h5netcdf = _importorskip("h5netcdf")
-has_pynio, requires_pynio = _importorskip("Nio")
 has_cftime, requires_cftime = _importorskip("cftime")
 has_dask, requires_dask = _importorskip("dask")
 with warnings.catch_warnings():
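The test-suite hunk drops the ``has_pynio, requires_pynio`` pair produced by ``_importorskip``. A dependency-free sketch of how such a helper can work; xarray's real ``_importorskip`` returns a ``pytest.mark.skipif`` marker, whereas this stand-in returns a plain decorator so the example needs no test framework:

```python
import importlib.util


def _importorskip(modname):
    """Return (has_mod, decorator); the decorator skips tests when missing.

    Sketch only: the real helper in xarray.tests builds a pytest skip
    marker instead of this hand-rolled decorator.
    """
    has = importlib.util.find_spec(modname) is not None

    def decorator(test_func):
        if has:
            return test_func

        def skipped(*args, **kwargs):
            # Stand-in for pytest's skip: report and do nothing.
            print(f"SKIPPED {test_func.__name__}: requires {modname}")

        return skipped

    return has, decorator


has_json, requires_json = _importorskip("json")   # stdlib, always present
has_nio, requires_nio = _importorskip("Nio")      # False once PyNIO is gone
```

Decorating a test with ``requires_json`` leaves it untouched, while ``requires_nio`` would replace it with a no-op skip on machines without PyNIO.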