Commit

Merge branch 'master' into maahn-groupy_plot2
* master: (51 commits)
  xarray.backends refactor (pydata#2261)
  Fix indexing error for data loaded with open_rasterio (pydata#2456)
  Properly support user-provided norm. (pydata#2443)
  pep8speaks (pydata#2462)
  isort (pydata#2469)
  tests shouldn't need to pass for a PR (pydata#2471)
  Replace the last of unittest with pytest (pydata#2467)
  Add python_requires to setup.py (pydata#2465)
  Update whats-new.rst (pydata#2466)
  Clean up _parse_array_of_cftime_strings (pydata#2464)
  plot.contour: Don't make cmap if colors is a single color. (pydata#2453)
  np.AxisError was added in numpy 1.13 (pydata#2455)
  Add CFTimeIndex.shift (pydata#2431)
  Fix FutureWarning in CFTimeIndex.date_type (pydata#2448)
  fix:2445 (pydata#2446)
  Enable use of cftime.datetime coordinates with differentiate and interp (pydata#2434)
  restore ddof support in std (pydata#2447)
  Future warning for default reduction dimension of groupby (pydata#2366)
  Remove incorrect statement about "drop" in the text docs (pydata#2439)
  Use profile mechanism, not no-op mutation (pydata#2442)
  ...
dcherian committed Oct 10, 2018
2 parents 87ef1cc + 289b377 commit 826df44
Showing 104 changed files with 6,486 additions and 2,051 deletions.
1 change: 0 additions & 1 deletion .github/PULL_REQUEST_TEMPLATE.md
@@ -1,4 +1,3 @@
- [ ] Closes #xxxx (remove if there is no corresponding issue, which should only be the case for minor changes)
- [ ] Tests added (for all bug fixes or enhancements)
- [ ] Tests passed (for all non-documentation changes)
- [ ] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)
11 changes: 11 additions & 0 deletions .pep8speaks.yml
@@ -0,0 +1,11 @@
# File : .pep8speaks.yml

scanner:
diff_only: True # If True, errors caused by only the patch are shown

pycodestyle:
max-line-length: 79
ignore: # Errors and warnings to ignore
- E402, # module level import not at top of file
- E731, # do not assign a lambda expression, use a def
- W503 # line break before binary operator
11 changes: 0 additions & 11 deletions .stickler.yml

This file was deleted.

82 changes: 29 additions & 53 deletions .travis.yml
@@ -1,5 +1,5 @@
# Based on http://conda.pydata.org/docs/travis.html
language: python
language: minimal
sudo: false # use container based build
notifications:
email: false
@@ -10,72 +10,48 @@ branches:
matrix:
fast_finish: true
include:
- python: 2.7
env: CONDA_ENV=py27-min
- python: 2.7
env: CONDA_ENV=py27-cdat+iris+pynio
- python: 3.5
env: CONDA_ENV=py35
- python: 3.6
env: CONDA_ENV=py36
- python: 3.6
env:
- env: CONDA_ENV=py27-min
- env: CONDA_ENV=py27-cdat+iris+pynio
- env: CONDA_ENV=py35
- env: CONDA_ENV=py36
- env: CONDA_ENV=py37
- env:
- CONDA_ENV=py36
- EXTRA_FLAGS="--run-flaky --run-network-tests"
- python: 3.6
env: CONDA_ENV=py36-netcdf4-dev
- env: CONDA_ENV=py36-netcdf4-dev
addons:
apt_packages:
- libhdf5-serial-dev
- netcdf-bin
- libnetcdf-dev
- python: 3.6
env: CONDA_ENV=py36-dask-dev
- python: 3.6
env: CONDA_ENV=py36-pandas-dev
- python: 3.6
env: CONDA_ENV=py36-bottleneck-dev
- python: 3.6
env: CONDA_ENV=py36-condaforge-rc
- python: 3.6
env: CONDA_ENV=py36-pynio-dev
- python: 3.6
env: CONDA_ENV=py36-rasterio-0.36
- python: 3.6
env: CONDA_ENV=py36-zarr-dev
- python: 3.5
env: CONDA_ENV=docs
- python: 3.6
env: CONDA_ENV=py36-hypothesis
- env: CONDA_ENV=py36-dask-dev
- env: CONDA_ENV=py36-pandas-dev
- env: CONDA_ENV=py36-bottleneck-dev
- env: CONDA_ENV=py36-condaforge-rc
- env: CONDA_ENV=py36-pynio-dev
- env: CONDA_ENV=py36-rasterio-0.36
- env: CONDA_ENV=py36-zarr-dev
- env: CONDA_ENV=docs
- env: CONDA_ENV=py36-hypothesis

allow_failures:
- python: 3.6
env:
- env:
- CONDA_ENV=py36
- EXTRA_FLAGS="--run-flaky --run-network-tests"
- python: 3.6
env: CONDA_ENV=py36-netcdf4-dev
- env: CONDA_ENV=py36-netcdf4-dev
addons:
apt_packages:
- libhdf5-serial-dev
- netcdf-bin
- libnetcdf-dev
- python: 3.6
env: CONDA_ENV=py36-pandas-dev
- python: 3.6
env: CONDA_ENV=py36-bottleneck-dev
- python: 3.6
env: CONDA_ENV=py36-condaforge-rc
- python: 3.6
env: CONDA_ENV=py36-pynio-dev
- python: 3.6
env: CONDA_ENV=py36-zarr-dev
- env: CONDA_ENV=py36-pandas-dev
- env: CONDA_ENV=py36-bottleneck-dev
- env: CONDA_ENV=py36-condaforge-rc
- env: CONDA_ENV=py36-pynio-dev
- env: CONDA_ENV=py36-zarr-dev

before_install:
- if [[ "$TRAVIS_PYTHON_VERSION" == "2.7" ]]; then
wget http://repo.continuum.io/miniconda/Miniconda-3.16.0-Linux-x86_64.sh -O miniconda.sh;
else
wget http://repo.continuum.io/miniconda/Miniconda3-3.16.0-Linux-x86_64.sh -O miniconda.sh;
fi
- wget http://repo.continuum.io/miniconda/Miniconda3-3.16.0-Linux-x86_64.sh -O miniconda.sh;
- bash miniconda.sh -b -p $HOME/miniconda
- export PATH="$HOME/miniconda/bin:$PATH"
- hash -r
@@ -95,9 +71,9 @@ install:
- python xarray/util/print_versions.py

script:
# TODO: restore this check once the upstream pandas issue is fixed:
# https://github.com/pandas-dev/pandas/issues/21071
# - python -OO -c "import xarray"
- which python
- python --version
- python -OO -c "import xarray"
- if [[ "$CONDA_ENV" == "docs" ]]; then
conda install -c conda-forge sphinx sphinx_rtd_theme sphinx-gallery numpydoc;
sphinx-build -n -j auto -b html -d _build/doctrees doc _build/html;
22 changes: 20 additions & 2 deletions README.rst
@@ -15,6 +15,8 @@ xarray: N-D labeled arrays and datasets
:target: https://zenodo.org/badge/latestdoi/13221727
.. image:: http://img.shields.io/badge/benchmarked%20by-asv-green.svg?style=flat
:target: http://pandas.pydata.org/speed/xarray/
.. image:: https://img.shields.io/badge/powered%20by-NumFOCUS-orange.svg?style=flat&colorA=E1523D&colorB=007D8A
:target: http://numfocus.org

**xarray** (formerly **xray**) is an open source project and Python package that aims to bring the
labeled data power of pandas_ to the physical sciences, by providing
@@ -103,20 +105,36 @@ Get in touch
.. _mailing list: https://groups.google.com/forum/#!forum/xarray
.. _on GitHub: http://github.com/pydata/xarray

NumFOCUS
--------

.. image:: https://numfocus.org/wp-content/uploads/2017/07/NumFocus_LRG.png
:scale: 25 %
:target: https://numfocus.org/

Xarray is a fiscally sponsored project of NumFOCUS_, a nonprofit dedicated
to supporting the open source scientific computing community. If you like
Xarray and want to support our mission, please consider making a donation_
to support our efforts.

.. _donation: https://www.flipcause.com/secure/cause_pdetails/NDE2NTU=

History
-------

xarray is an evolution of an internal tool developed at `The Climate
Corporation`__. It was originally written by Climate Corp researchers Stephan
Hoyer, Alex Kleeman and Eugene Brevdo and was released as open source in
May 2014. The project was renamed from "xray" in January 2016.
May 2014. The project was renamed from "xray" in January 2016. Xarray became a
fiscally sponsored project of NumFOCUS_ in August 2018.

__ http://climate.com/
.. _NumFOCUS: https://numfocus.org

License
-------

Copyright 2014-2017, xarray Developers
Copyright 2014-2018, xarray Developers

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
1 change: 1 addition & 0 deletions asv_bench/asv.conf.json
@@ -64,6 +64,7 @@
"scipy": [""],
"bottleneck": ["", null],
"dask": [""],
"distributed": [""],
},


43 changes: 42 additions & 1 deletion asv_bench/benchmarks/dataset_io.py
@@ -1,11 +1,13 @@
from __future__ import absolute_import, division, print_function

import os

import numpy as np
import pandas as pd

import xarray as xr

from . import randn, randint, requires_dask
from . import randint, randn, requires_dask

try:
import dask
@@ -14,6 +16,9 @@
pass


os.environ['HDF5_USE_FILE_LOCKING'] = 'FALSE'


class IOSingleNetCDF(object):
"""
A few examples that benchmark reading/writing a single netCDF file with
@@ -405,3 +410,39 @@ def time_open_dataset_scipy_with_time_chunks(self):
with dask.set_options(get=dask.multiprocessing.get):
xr.open_mfdataset(self.filenames_list, engine='scipy',
chunks=self.time_chunks)


def create_delayed_write():
import dask.array as da
vals = da.random.random(300, chunks=(1,))
ds = xr.Dataset({'vals': (['a'], vals)})
return ds.to_netcdf('file.nc', engine='netcdf4', compute=False)


class IOWriteNetCDFDask(object):
timeout = 60
repeat = 1
number = 5

def setup(self):
requires_dask()
self.write = create_delayed_write()

def time_write(self):
self.write.compute()


class IOWriteNetCDFDaskDistributed(object):
def setup(self):
try:
import distributed
except ImportError:
raise NotImplementedError
self.client = distributed.Client()
self.write = create_delayed_write()

def cleanup(self):
self.client.shutdown()

def time_write(self):
self.write.compute()
26 changes: 26 additions & 0 deletions asv_bench/benchmarks/unstacking.py
@@ -0,0 +1,26 @@
from __future__ import absolute_import, division, print_function

import numpy as np

import xarray as xr

from . import requires_dask


class Unstacking(object):
def setup(self):
data = np.random.RandomState(0).randn(1, 1000, 500)
self.ds = xr.DataArray(data).stack(flat_dim=['dim_1', 'dim_2'])

def time_unstack_fast(self):
self.ds.unstack('flat_dim')

def time_unstack_slow(self):
self.ds[:, ::-1].unstack('flat_dim')


class UnstackingDask(Unstacking):
def setup(self, *args, **kwargs):
requires_dask()
super(UnstackingDask, self).setup(**kwargs)
self.ds = self.ds.chunk({'flat_dim': 50})
13 changes: 13 additions & 0 deletions ci/requirements-py37.yml
@@ -0,0 +1,13 @@
name: test_env
channels:
- defaults
dependencies:
- python=3.7
- pip:
- pytest
- flake8
- mock
- numpy
- pandas
- coveralls
- pytest-cov
Binary file added doc/_static/numfocus_logo.png
2 changes: 2 additions & 0 deletions doc/api-hidden.rst
@@ -151,3 +151,5 @@
plot.FacetGrid.set_titles
plot.FacetGrid.set_ticks
plot.FacetGrid.map

CFTimeIndex.shift
12 changes: 12 additions & 0 deletions doc/api.rst
@@ -150,6 +150,7 @@ Computation
Dataset.resample
Dataset.diff
Dataset.quantile
Dataset.differentiate

**Aggregation**:
:py:attr:`~Dataset.all`
@@ -317,6 +318,7 @@ Computation
DataArray.diff
DataArray.dot
DataArray.quantile
DataArray.differentiate

**Aggregation**:
:py:attr:`~DataArray.all`
@@ -555,6 +557,13 @@ Custom Indexes

CFTimeIndex

Creating custom indexes
-----------------------
.. autosummary::
:toctree: generated/

cftime_range

Plotting
========

Expand Down Expand Up @@ -615,3 +624,6 @@ arguments for the ``from_store`` and ``dump_to_store`` Dataset methods:
backends.H5NetCDFStore
backends.PydapDataStore
backends.ScipyDataStore
backends.FileManager
backends.CachingFileManager
backends.DummyFileManager
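
The cftime-related entries above (``cftime_range`` and ``CFTimeIndex.shift``, both part of this merge) can be exercised roughly as follows; this is a hedged sketch that assumes the optional ``cftime`` package is installed, and the dates and frequency are purely illustrative:

import xarray as xr

# Build a CFTimeIndex on a no-leap calendar spanning the end of February;
# a noleap calendar has no Feb 29, which is what CFTimeIndex is for.
times = xr.cftime_range(start='2000-02-27', periods=4, freq='D',
                        calendar='noleap')

# CFTimeIndex.shift moves every element by n steps of the given frequency.
shifted = times.shift(2, 'D')

print(times)
print(shifted)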
25 changes: 25 additions & 0 deletions doc/computation.rst
@@ -200,6 +200,31 @@ You can also use ``construct`` to compute a weighted rolling sum:
To avoid this, use ``skipna=False`` as in the example above.


Computation using Coordinates
=============================

Xarray objects have some handy methods for computation using their
coordinates. :py:meth:`~xarray.DataArray.differentiate` computes derivatives by
central finite differences, using the coordinate values as the spacing,

.. ipython:: python
a = xr.DataArray([0, 1, 2, 3], dims=['x'], coords=[[0.1, 0.11, 0.2, 0.3]])
a
a.differentiate('x')

This method can also be used with multidimensional arrays,

.. ipython:: python
a = xr.DataArray(np.arange(8).reshape(4, 2), dims=['x', 'y'],
coords={'x': [0.1, 0.11, 0.2, 0.3]})
a.differentiate('x')

.. note::
This method is limited to simple Cartesian geometry. Differentiation along
a multidimensional coordinate is not supported.

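As a rough cross-check of the behaviour described above (not part of the diff): for a one-dimensional coordinate, ``differentiate`` uses second-order central differences, so it should agree with ``numpy.gradient`` evaluated against the same coordinate values; the numbers below are just the example data from this section.

import numpy as np
import xarray as xr

x = np.array([0.1, 0.11, 0.2, 0.3])
a = xr.DataArray([0.0, 1.0, 2.0, 3.0], dims=['x'], coords={'x': x})

# Both compute central differences in the interior and one-sided
# differences at the edges, so the results should match closely.
np.testing.assert_allclose(a.differentiate('x').values,
                           np.gradient(a.values, x))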
.. _compute.broadcasting:

Broadcasting by dimension name
7 changes: 0 additions & 7 deletions doc/data-structures.rst
@@ -408,13 +408,6 @@ operations keep around coordinates:
list(ds[['x']])
list(ds.drop('temperature'))
If a dimension name is given as an argument to ``drop``, it also drops all
variables that use that dimension:

.. ipython:: python
list(ds.drop('time'))
As an alternative to dictionary-like modifications, you can use
:py:meth:`~xarray.Dataset.assign` and :py:meth:`~xarray.Dataset.assign_coords`.
These methods return a new dataset with additional (or replaced) values:
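
A small illustration of that pattern (the dataset contents here are made up, not taken from the docs):

import numpy as np
import xarray as xr

ds = xr.Dataset({'temperature': ('x', np.array([11.2, 12.5, 13.1]))},
                coords={'x': [10, 20, 30]})

# assign and assign_coords each return a *new* Dataset; ds is unchanged.
ds_f = ds.assign(temperature_f=ds.temperature * 9 / 5 + 32)
ds_km = ds_f.assign_coords(x=ds_f.x / 1000.0)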
