forked from pydata/xarray

Merge remote-tracking branch 'upstream/master' into fix/groupby-nan
* upstream/master:
  Whatsnew for pydata#3419 (pydata#3422)
  Revert changes made in pydata#3358 (pydata#3411)
  Python3.6 idioms (pydata#3419)
  Temporarily mark pseudonetcdf-3.1 as incompatible (pydata#3420)
  Fix and add test for groupby_bins() isnan TypeError. (pydata#3405)
  Update where docstring to make return value type more clear (pydata#3408)
  tests for arrays with units (pydata#3238)
dcherian committed Oct 21, 2019
2 parents cfe87e0 + b0c336f commit 43aeffd
Showing 67 changed files with 1,905 additions and 229 deletions.
1 change: 1 addition & 0 deletions ci/requirements/py36-min-all-deps.yml
@@ -31,6 +31,7 @@ dependencies:
- numba=0.44
- numpy=1.14
- pandas=0.24
# - pint # See py36-min-nep18.yml
- pip
- pseudonetcdf=3.0
- pydap=3.2
3 changes: 2 additions & 1 deletion ci/requirements/py36-min-nep18.yml
@@ -2,14 +2,15 @@ name: xarray-tests
channels:
- conda-forge
dependencies:
# Optional dependencies that require NEP18, such as sparse,
# Optional dependencies that require NEP18, such as sparse and pint,
# require drastically newer packages than everything else
- python=3.6
- coveralls
- dask=2.4
- distributed=2.4
- numpy=1.17
- pandas=0.24
- pint=0.9 # Actually not enough as it doesn't implement __array_function__ yet!
- pytest
- pytest-cov
- pytest-env
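As a quick, illustrative check of the caveat in the comment above — a minimal sketch assuming pint 0.9 and numpy >= 1.17 are installed; it is not part of this diff:

```python
# Minimal sketch: probe whether the installed pint exposes NEP-18 dispatch.
import pint

ureg = pint.UnitRegistry()
quantity = ureg.Quantity([1.0, 2.0, 3.0], "metre")

# Expected to print False on pint 0.9, which is why the pin above is
# flagged as "not enough" for the NEP-18 test environment.
print(hasattr(type(quantity), "__array_function__"))
```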
3 changes: 2 additions & 1 deletion ci/requirements/py36.yml
@@ -27,8 +27,9 @@ dependencies:
- numba
- numpy
- pandas
- pint
- pip
- pseudonetcdf
- pseudonetcdf<3.1 # FIXME https://github.com/pydata/xarray/issues/3409
- pydap
- pynio
- pytest
3 changes: 2 additions & 1 deletion ci/requirements/py37-windows.yml
@@ -27,8 +27,9 @@ dependencies:
- numba
- numpy
- pandas
- pint
- pip
- pseudonetcdf
- pseudonetcdf<3.1 # FIXME https://github.com/pydata/xarray/issues/3409
- pydap
# - pynio # Not available on Windows
- pytest
3 changes: 2 additions & 1 deletion ci/requirements/py37.yml
@@ -27,8 +27,9 @@ dependencies:
- numba
- numpy
- pandas
- pint
- pip
- pseudonetcdf
- pseudonetcdf<3.1 # FIXME https://github.com/pydata/xarray/issues/3409
- pydap
- pynio
- pytest
1 change: 0 additions & 1 deletion doc/gallery/plot_cartopy_facetgrid.py
@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
"""
==================================
Multiple plots and map projections
1 change: 0 additions & 1 deletion doc/gallery/plot_colorbar_center.py
@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
"""
==================
Centered colormaps
1 change: 0 additions & 1 deletion doc/gallery/plot_control_colorbar.py
@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
"""
===========================
Control the plot's colorbar
1 change: 0 additions & 1 deletion doc/gallery/plot_lines_from_2d.py
@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
"""
==================================
Multiple lines from a 2d DataArray
1 change: 0 additions & 1 deletion doc/gallery/plot_rasterio.py
@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
"""
.. _recipes.rasterio:
1 change: 0 additions & 1 deletion doc/gallery/plot_rasterio_rgb.py
@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
"""
.. _recipes.rasterio_rgb:
11 changes: 10 additions & 1 deletion doc/installing.rst
@@ -66,6 +66,15 @@ For plotting
Alternative data containers
~~~~~~~~~~~~~~~~~~~~~~~~~~~
- `sparse <https://sparse.pydata.org/>`_: for sparse arrays
- `pint <https://pint.readthedocs.io/>`_: for units of measure

.. note::

At the time of writing, xarray requires a `highly experimental version of pint
<https://github.com/andrewgsavage/pint/pull/6>`_ (install with
``pip install git+https://github.com/andrewgsavage/pint.git@refs/pull/6/head``).
Even with it, interaction with non-numpy array libraries, e.g. dask or sparse, is broken.

- Any numpy-like objects that support
`NEP-18 <https://numpy.org/neps/nep-0018-array-function-protocol.html>`_.
Note that while such libraries theoretically should work, they are untested.
@@ -85,7 +94,7 @@ dependencies:
(`NEP-29 <https://numpy.org/neps/nep-0029-deprecation_policy.html>`_)
- **pandas:** 12 months
- **scipy:** 12 months
- **sparse** and other libraries that rely on
- **sparse, pint** and other libraries that rely on
`NEP-18 <https://numpy.org/neps/nep-0018-array-function-protocol.html>`_
for integration: very latest available versions only, until the technology has
matured. This extends to dask when used in conjunction with any of these libraries.
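As a rough illustration of what the optional pint dependency described above enables, here is a minimal sketch; it assumes the experimental, NEP-18 capable pint build and numpy >= 1.17, and the printed results are expectations rather than documented guarantees:

```python
# Minimal sketch: wrap a pint quantity in a DataArray and check that the
# units survive basic arithmetic. Requires the experimental pint build
# referenced above.
import numpy as np
import pint
import xarray as xr

ureg = pint.UnitRegistry(force_ndarray=True)
quantity = ureg.Quantity(np.arange(4.0), "metre")

da = xr.DataArray(quantity, dims="x")
print(da.data.units)        # units are carried by the wrapped duck array
print((da * 2).data.units)  # arithmetic is expected to preserve them
```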
24 changes: 24 additions & 0 deletions doc/whats-new.rst
@@ -24,6 +24,24 @@ Bug fixes
- Fix grouping over variables with NaNs. (:issue:`2383`, :pull:`3406`).
By `Deepak Cherian <https://github.com/dcherian>`_.

New Features
~~~~~~~~~~~~
- Added integration tests against `pint <https://pint.readthedocs.io/>`_.
(:pull:`3238`) by `Justus Magin <https://github.com/keewis>`_.

.. note::

At the time of writing, these tests *as well as the ability to use pint in general*
require `a highly experimental version of pint
<https://github.com/andrewgsavage/pint/pull/6>`_ (install with
``pip install git+https://github.com/andrewgsavage/pint.git@refs/pull/6/head``).
Even with it, interaction with non-numpy array libraries, e.g. dask or sparse, is broken.

Bug fixes
~~~~~~~~~
- Fix regression introduced in v0.14.0 that would cause a crash if dask is installed
but cloudpickle isn't (:issue:`3401`) by `Rhys Doyle <https://github.com/rdoyle45>`_.

Documentation
~~~~~~~~~~~~~

@@ -32,6 +50,12 @@ Documentation
datetime-like dimension is required. (:pull:`3400`)
By `Justus Magin <https://github.com/keewis>`_.

Internal Changes
~~~~~~~~~~~~~~~~

- Use Python 3.6 idioms throughout the codebase. (:pull:`3419`)
By `Maximilian Roos <https://github.com/max-sixty>`_.

.. _whats-new.0.14.0:

v0.14.0 (14 Oct 2019)
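Most of the code hunks below are mechanical applications of the "Python 3.6 idioms" entry above. A minimal sketch of the pattern (the variable names are illustrative only):

```python
# Minimal sketch of the idiom swap: %-formatting / str.format -> f-strings.
name, dtype = "temperature", "float64"

old_style = "unsupported dtype for variable %s: %s" % (name, dtype)
format_style = "unsupported dtype for variable {}: {}".format(name, dtype)
f_string = f"unsupported dtype for variable {name}: {dtype}"

assert old_style == format_style == f_string
```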
2 changes: 2 additions & 0 deletions setup.cfg
@@ -73,6 +73,8 @@ ignore_missing_imports = True
ignore_missing_imports = True
[mypy-pandas.*]
ignore_missing_imports = True
[mypy-pint.*]
ignore_missing_imports = True
[mypy-PseudoNetCDF.*]
ignore_missing_imports = True
[mypy-pydap.*]
7 changes: 3 additions & 4 deletions xarray/_version.py
@@ -94,7 +94,7 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False, env=
return None, None
else:
if verbose:
print("unable to find command, tried %s" % (commands,))
print(f"unable to find command, tried {commands}")
return None, None
stdout = p.communicate()[0].strip()
if sys.version_info[0] >= 3:
@@ -302,9 +302,8 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
if verbose:
fmt = "tag '%s' doesn't start with prefix '%s'"
print(fmt % (full_tag, tag_prefix))
pieces["error"] = "tag '%s' doesn't start with prefix '%s'" % (
full_tag,
tag_prefix,
pieces["error"] = "tag '{}' doesn't start with prefix '{}'".format(
full_tag, tag_prefix
)
return pieces
pieces["closest-tag"] = full_tag[len(tag_prefix) :]
6 changes: 2 additions & 4 deletions xarray/backends/api.py
@@ -718,7 +718,7 @@ def open_mfdataset(
autoclose=None,
parallel=False,
join="outer",
**kwargs
**kwargs,
):
"""Open multiple files as a single dataset.
@@ -1258,9 +1258,7 @@ def _validate_append_dim_and_encoding(
return
if append_dim:
if append_dim not in ds.dims:
raise ValueError(
"{} not a valid dimension in the Dataset".format(append_dim)
)
raise ValueError(f"{append_dim} not a valid dimension in the Dataset")
for data_var in ds_to_append:
if data_var in ds:
if append_dim is None:
4 changes: 2 additions & 2 deletions xarray/backends/file_manager.py
@@ -83,7 +83,7 @@ def __init__(
kwargs=None,
lock=None,
cache=None,
ref_counts=None
ref_counts=None,
):
"""Initialize a FileManager.
@@ -267,7 +267,7 @@ def __setstate__(self, state):
def __repr__(self):
args_string = ", ".join(map(repr, self._args))
if self._mode is not _DEFAULT_MODE:
args_string += ", mode={!r}".format(self._mode)
args_string += f", mode={self._mode!r}"
return "{}({!r}, {}, kwargs={})".format(
type(self).__name__, self._opener, args_string, self._kwargs
)
17 changes: 11 additions & 6 deletions xarray/backends/locks.py
@@ -1,7 +1,7 @@
import multiprocessing
import threading
import weakref
from typing import Any, MutableMapping
from typing import Any, MutableMapping, Optional

try:
from dask.utils import SerializableLock
@@ -62,7 +62,7 @@ def _get_lock_maker(scheduler=None):
return _LOCK_MAKERS[scheduler]


def _get_scheduler(get=None, collection=None):
def _get_scheduler(get=None, collection=None) -> Optional[str]:
"""Determine the dask scheduler that is being used.
None is returned if no dask scheduler is active.
@@ -86,10 +86,15 @@ def _get_scheduler(get=None, collection=None):
except (ImportError, AttributeError):
pass

if actual_get is dask.multiprocessing.get:
return "multiprocessing"
else:
return "threaded"
try:
# As of dask=2.6, dask.multiprocessing requires cloudpickle to be installed
# Dependency removed in https://github.com/dask/dask/pull/5511
if actual_get is dask.multiprocessing.get:
return "multiprocessing"
except AttributeError:
pass

return "threaded"


def get_write_lock(key):
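A standalone sketch of the guard introduced in the hunk above, using a hypothetical wrapper name (``detect_scheduler``) so the behaviour is easy to try; it assumes dask >= 2.4 is installed and is not the actual xarray helper:

```python
# Minimal sketch: mirror the AttributeError guard from the hunk above.
import dask
import dask.threaded


def detect_scheduler(actual_get):
    try:
        # dask.multiprocessing may be unavailable (e.g. on dask >= 2.6 without
        # cloudpickle), in which case the attribute lookup raises AttributeError.
        if actual_get is dask.multiprocessing.get:
            return "multiprocessing"
    except AttributeError:
        pass
    return "threaded"


print(detect_scheduler(dask.threaded.get))  # -> "threaded"
```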
2 changes: 1 addition & 1 deletion xarray/backends/netCDF4_.py
@@ -137,7 +137,7 @@ def _nc4_dtype(var):
elif var.dtype.kind in ["i", "u", "f", "c", "S"]:
dtype = var.dtype
else:
raise ValueError("unsupported dtype for netCDF4 variable: {}".format(var.dtype))
raise ValueError(f"unsupported dtype for netCDF4 variable: {var.dtype}")
return dtype


2 changes: 1 addition & 1 deletion xarray/backends/netcdf3.py
@@ -50,7 +50,7 @@ def coerce_nc3_dtype(arr):
cast_arr = arr.astype(new_dtype)
if not (cast_arr == arr).all():
raise ValueError(
"could not safely cast array from dtype %s to %s" % (dtype, new_dtype)
f"could not safely cast array from dtype {dtype} to {new_dtype}"
)
arr = cast_arr
return arr
2 changes: 1 addition & 1 deletion xarray/backends/pydap_.py
@@ -49,7 +49,7 @@ def _fix_attributes(attributes):
# dot-separated key
attributes.update(
{
"{}.{}".format(k, k_child): v_child
f"{k}.{k_child}": v_child
for k_child, v_child in attributes.pop(k).items()
}
)
10 changes: 3 additions & 7 deletions xarray/coding/cftime_offsets.py
@@ -638,7 +638,7 @@ def __apply__(self, other):


_FREQUENCY_CONDITION = "|".join(_FREQUENCIES.keys())
_PATTERN = r"^((?P<multiple>\d+)|())(?P<freq>({}))$".format(_FREQUENCY_CONDITION)
_PATTERN = fr"^((?P<multiple>\d+)|())(?P<freq>({_FREQUENCY_CONDITION}))$"


# pandas defines these offsets as "Tick" objects, which for instance have
@@ -759,19 +759,15 @@ def _generate_range(start, end, periods, offset):

next_date = current + offset
if next_date <= current:
raise ValueError(
"Offset {offset} did not increment date".format(offset=offset)
)
raise ValueError(f"Offset {offset} did not increment date")
current = next_date
else:
while current >= end:
yield current

next_date = current + offset
if next_date >= current:
raise ValueError(
"Offset {offset} did not decrement date".format(offset=offset)
)
raise ValueError(f"Offset {offset} did not decrement date")
current = next_date


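To make the ``_PATTERN`` f-string above concrete, here is a small sketch of how such a frequency pattern parses strings like ``"3D"``; the handful of frequency codes listed is illustrative, not xarray's full ``_FREQUENCIES`` table:

```python
# Minimal sketch: the pattern splits an optional integer multiple from a
# frequency code, e.g. "3D" -> multiple "3", freq "D".
import re

frequency_condition = "|".join(["D", "H", "T", "MS", "A"])
pattern = fr"^((?P<multiple>\d+)|())(?P<freq>({frequency_condition}))$"

match = re.match(pattern, "3D")
print(match.group("multiple"), match.group("freq"))  # -> 3 D
```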
2 changes: 1 addition & 1 deletion xarray/coding/cftimeindex.py
@@ -403,7 +403,7 @@ def shift(self, n, freq):
from .cftime_offsets import to_offset

if not isinstance(n, int):
raise TypeError("'n' must be an int, got {}.".format(n))
raise TypeError(f"'n' must be an int, got {n}.")
if isinstance(freq, timedelta):
return self + n * freq
elif isinstance(freq, str):
2 changes: 1 addition & 1 deletion xarray/coding/strings.py
@@ -228,7 +228,7 @@ def shape(self):
return self.array.shape[:-1]

def __repr__(self):
return "%s(%r)" % (type(self).__name__, self.array)
return "{}({!r})".format(type(self).__name__, self.array)

def __getitem__(self, key):
# require slicing the last dimension completely
4 changes: 2 additions & 2 deletions xarray/coding/times.py
@@ -286,7 +286,7 @@ def infer_datetime_units(dates):
# NumPy casting bug: https://github.com/numpy/numpy/issues/11096
unique_timedeltas = to_timedelta_unboxed(unique_timedeltas)
units = _infer_time_units_from_diff(unique_timedeltas)
return "%s since %s" % (units, reference_date)
return f"{units} since {reference_date}"


def format_cftime_datetime(date):
@@ -341,7 +341,7 @@ def cftime_to_nptime(times):
def _cleanup_netcdf_time_units(units):
delta, ref_date = _unpack_netcdf_time_units(units)
try:
units = "%s since %s" % (delta, format_timestamp(ref_date))
units = "{} since {}".format(delta, format_timestamp(ref_date))
except OutOfBoundsDatetime:
# don't worry about reifying the units if they're out of bounds
pass
9 changes: 3 additions & 6 deletions xarray/coding/variables.py
@@ -73,11 +73,8 @@ def __array__(self, dtype=None):
return self.func(self.array)

def __repr__(self):
return "%s(%r, func=%r, dtype=%r)" % (
type(self).__name__,
self.array,
self.func,
self.dtype,
return "{}({!r}, func={!r}, dtype={!r})".format(
type(self).__name__, self.array, self.func, self.dtype
)


@@ -113,7 +110,7 @@ def unpack_for_decoding(var):

def safe_setitem(dest, key, value, name=None):
if key in dest:
var_str = " on variable {!r}".format(name) if name else ""
var_str = f" on variable {name!r}" if name else ""
raise ValueError(
"failed to prevent overwriting existing key {} in attrs{}. "
"This is probably an encoding field used by xarray to describe "