forked from pydata/xarray

Merge remote-tracking branch 'upstream/main' into groupby-plot
* upstream/main: (307 commits)
  Use same bool validator as other inputs (pydata#5703)
  conditionally disable bottleneck (pydata#5560)
  Refactor index vs. coordinate variable(s) (pydata#5636)
  pre-commit: autoupdate hook versions (pydata#5685)
  Flexible Indexes: Avoid len(index) in map_blocks (pydata#5670)
  Speed up _mapping_repr (pydata#5661)
  update the link to `scipy`'s intersphinx file (pydata#5665)
  Bump styfle/cancel-workflow-action from 0.9.0 to 0.9.1 (pydata#5663)
  pre-commit: autoupdate hook versions (pydata#5660)
  fix the binder environment (pydata#5650)
  Update api.rst (pydata#5639)
  Kwargs to rasterio open (pydata#5609)
  Bump codecov/codecov-action from 1 to 2.0.2 (pydata#5633)
  new blank whats-new for v0.19.1
  v0.19.0 release notes (pydata#5632)
  remove deprecations scheduled for 0.19 (pydata#5630)
  Make typing-extensions optional (pydata#5624)
  Plots get labels from pint arrays (pydata#5561)
  Add to_numpy() and as_numpy() methods (pydata#5568)
  pin fsspec (pydata#5627)
  ...
dcherian committed Aug 13, 2021
2 parents 59b2fdc + 2705c63 commit 7337ddd
Showing 214 changed files with 23,526 additions and 7,246 deletions.
3 changes: 2 additions & 1 deletion .binder/environment.yml
@@ -2,7 +2,7 @@ name: xarray-examples
channels:
- conda-forge
dependencies:
-- python=3.8
+- python=3.9
- boto3
- bottleneck
- cartopy
@@ -26,6 +26,7 @@ dependencies:
- pandas
- pint
- pip
+- pooch
- pydap
- pynio
- rasterio
1 change: 1 addition & 0 deletions .git_archival.txt
@@ -0,0 +1 @@
ref-names: $Format:%D$
2 changes: 2 additions & 0 deletions .gitattributes
@@ -1,2 +1,4 @@
# reduce the number of merge conflicts
doc/whats-new.rst merge=union
+# allow installing from git archives
+.git_archival.txt export-subst
29 changes: 0 additions & 29 deletions .github/actions/detect-ci-trigger/action.yaml

This file was deleted.

47 changes: 0 additions & 47 deletions .github/actions/detect-ci-trigger/script.sh

This file was deleted.

7 changes: 7 additions & 0 deletions .github/dependabot.yml
@@ -0,0 +1,7 @@
version: 2
updates:
  - package-ecosystem: 'github-actions'
    directory: '/'
    schedule:
      # Check for updates once a week
      interval: 'weekly'
15 changes: 15 additions & 0 deletions .github/workflows/cancel-duplicate-runs.yaml
@@ -0,0 +1,15 @@
name: Cancel
on:
  workflow_run:
    workflows: ["CI", "CI Additional", "CI Upstream"]
    types:
      - requested
jobs:
  cancel:
    name: Cancel previous runs
    runs-on: ubuntu-latest
    if: github.repository == 'pydata/xarray'
    steps:
      - uses: styfle/cancel-workflow-action@0.9.1
        with:
          workflow_id: ${{ github.event.workflow.id }}
33 changes: 8 additions & 25 deletions .github/workflows/ci-additional.yaml
@@ -12,14 +12,16 @@ jobs:
detect-ci-trigger:
name: detect ci trigger
runs-on: ubuntu-latest
-if: github.event_name == 'push' || github.event_name == 'pull_request'
+if: |
+github.repository == 'pydata/xarray'
+&& (github.event_name == 'push' || github.event_name == 'pull_request')
outputs:
triggered: ${{ steps.detect-trigger.outputs.trigger-found }}
steps:
- uses: actions/checkout@v2
with:
fetch-depth: 2
-- uses: ./.github/actions/detect-ci-trigger
+- uses: xarray-contrib/ci-trigger@v1.1
id: detect-trigger
with:
keyword: "[skip-ci]"
@@ -42,26 +44,16 @@ jobs:
"py37-min-all-deps",
"py37-min-nep18",
"py38-all-but-dask",
-"py38-backend-api-v2",
"py38-flaky",
]
steps:
-- name: Cancel previous runs
-uses: styfle/cancel-workflow-action@0.6.0
-with:
-access_token: ${{ github.token }}
- uses: actions/checkout@v2
with:
fetch-depth: 0 # Fetch all history for all branches and tags.

- name: Set environment variables
run: |
-if [[ ${{ matrix.env }} == "py38-backend-api-v2" ]] ;
-then
-echo "CONDA_ENV_FILE=ci/requirements/environment.yml" >> $GITHUB_ENV
-echo "XARRAY_BACKEND_API=v2" >> $GITHUB_ENV
-elif [[ ${{ matrix.env }} == "py38-flaky" ]] ;
+if [[ ${{ matrix.env }} == "py38-flaky" ]] ;
then
echo "CONDA_ENV_FILE=ci/requirements/environment.yml" >> $GITHUB_ENV
echo "PYTEST_EXTRA_FLAGS=--run-flaky --run-network-tests" >> $GITHUB_ENV
@@ -111,7 +103,7 @@ jobs:
$PYTEST_EXTRA_FLAGS
- name: Upload code coverage to Codecov
-uses: codecov/codecov-action@v1
+uses: codecov/codecov-action@v2.0.2
with:
file: ./coverage.xml
flags: unittests,${{ matrix.env }}
@@ -121,17 +113,12 @@ jobs:
doctest:
name: Doctests
runs-on: "ubuntu-latest"
-needs: detect-ci-trigger
-if: needs.detect-ci-trigger.outputs.triggered == 'false'
+if: github.repository == 'pydata/xarray'
defaults:
run:
shell: bash -l {0}

steps:
-- name: Cancel previous runs
-uses: styfle/cancel-workflow-action@0.6.0
-with:
-access_token: ${{ github.token }}
- uses: actions/checkout@v2
with:
fetch-depth: 0 # Fetch all history for all branches and tags.
@@ -169,10 +156,6 @@ jobs:
shell: bash -l {0}

steps:
-- name: Cancel previous runs
-uses: styfle/cancel-workflow-action@0.6.0
-with:
-access_token: ${{ github.token }}
- uses: actions/checkout@v2
with:
fetch-depth: 0 # Fetch all history for all branches and tags.
@@ -186,6 +169,6 @@ jobs:

- name: minimum versions policy
run: |
-mamba install -y pyyaml conda
+mamba install -y pyyaml conda python-dateutil
python ci/min_deps_check.py ci/requirements/py37-bare-minimum.yml
python ci/min_deps_check.py ci/requirements/py37-min-all-deps.yml
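The python-dateutil package is added to the install line above for the "minimum versions policy" step, presumably for date arithmetic. As a rough illustration only (this is not the contents of ci/min_deps_check.py, which are not shown here, and the 12-month window is a made-up number), the kind of check such a policy needs looks like this:

import datetime

from dateutil.relativedelta import relativedelta

# Hypothetical release date of the oldest version of a dependency that the
# requirements file pins as its minimum.
published = datetime.date(2020, 6, 1)

# Hypothetical policy window: pinned minimum versions may not be older than this.
policy_months = 12
cutoff = datetime.date.today() - relativedelta(months=policy_months)

if published < cutoff:
    print("pinned minimum version is older than the policy window")
else:
    print("pinned minimum version is within the policy window")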
44 changes: 44 additions & 0 deletions .github/workflows/ci-pre-commit-autoupdate.yaml
@@ -0,0 +1,44 @@
name: "pre-commit autoupdate CI"

on:
  schedule:
    - cron: "0 0 * * 0" # every Sunday at 00:00 UTC
  workflow_dispatch:


jobs:
  autoupdate:
    name: 'pre-commit autoupdate'
    runs-on: ubuntu-latest
    if: github.repository == 'pydata/xarray'
    steps:
      - name: checkout
        uses: actions/checkout@v2
      - name: Cache pip and pre-commit
        uses: actions/cache@v2
        with:
          path: |
            ~/.cache/pre-commit
            ~/.cache/pip
          key: ${{ runner.os }}-pre-commit-autoupdate
      - name: setup python
        uses: actions/setup-python@v2
      - name: upgrade pip
        run: python -m pip install --upgrade pip
      - name: install dependencies
        run: python -m pip install --upgrade pre-commit pyyaml packaging
      - name: version info
        run: python -m pip list
      - name: autoupdate
        uses: technote-space/create-pr-action@837dbe469b39f08d416889369a52e2a993625c84
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          EXECUTE_COMMANDS: |
            python -m pre_commit autoupdate
            python -m pre_commit run --all-files
          COMMIT_MESSAGE: 'pre-commit: autoupdate hook versions'
          COMMIT_NAME: 'github-actions[bot]'
          COMMIT_EMAIL: 'github-actions[bot]@users.noreply.github.com'
          PR_TITLE: 'pre-commit: autoupdate hook versions'
          PR_BRANCH_PREFIX: 'pre-commit/'
          PR_BRANCH_NAME: 'autoupdate-${PR_ID}'
3 changes: 2 additions & 1 deletion .github/workflows/ci-pre-commit.yml
@@ -10,7 +10,8 @@ jobs:
linting:
name: "pre-commit hooks"
runs-on: ubuntu-latest
+if: github.repository == 'pydata/xarray'
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
-- uses: pre-commit/action@v2.0.0
+- uses: pre-commit/action@v2.0.3
50 changes: 36 additions & 14 deletions .github/workflows/ci.yaml
@@ -12,14 +12,16 @@ jobs:
detect-ci-trigger:
name: detect ci trigger
runs-on: ubuntu-latest
-if: github.event_name == 'push' || github.event_name == 'pull_request'
+if: |
+github.repository == 'pydata/xarray'
+&& (github.event_name == 'push' || github.event_name == 'pull_request')
outputs:
triggered: ${{ steps.detect-trigger.outputs.trigger-found }}
steps:
- uses: actions/checkout@v2
with:
fetch-depth: 2
-- uses: ./.github/actions/detect-ci-trigger
+- uses: xarray-contrib/ci-trigger@v1.1
id: detect-trigger
with:
keyword: "[skip-ci]"
@@ -35,12 +37,9 @@ jobs:
fail-fast: false
matrix:
os: ["ubuntu-latest", "macos-latest", "windows-latest"]
-python-version: ["3.7", "3.8"]
+# Bookend python versions
+python-version: ["3.7", "3.9"]
steps:
-- name: Cancel previous runs
-uses: styfle/cancel-workflow-action@0.6.0
-with:
-access_token: ${{ github.token }}
- uses: actions/checkout@v2
with:
fetch-depth: 0 # Fetch all history for all branches and tags.
@@ -59,8 +58,7 @@ jobs:
uses: actions/cache@v2
with:
path: ~/conda_pkgs_dir
-key:
-${{ runner.os }}-conda-py${{ matrix.python-version }}-${{
+key: ${{ runner.os }}-conda-py${{ matrix.python-version }}-${{
hashFiles('ci/requirements/**.yml') }}
- uses: conda-incubator/setup-miniconda@v2
with:
@@ -89,16 +87,40 @@ jobs:
run: |
python -c "import xarray"
- name: Run tests
-run: |
-python -m pytest -n 4 \
---cov=xarray \
---cov-report=xml
+run: python -m pytest -n 4
+--cov=xarray
+--cov-report=xml
+--junitxml=pytest.xml

+- name: Upload test results
+if: always()
+uses: actions/upload-artifact@v2
+with:
+name: Test results for ${{ runner.os }}-${{ matrix.python-version }}
+path: pytest.xml

- name: Upload code coverage to Codecov
-uses: codecov/codecov-action@v1
+uses: codecov/codecov-action@v2.0.2
with:
file: ./coverage.xml
flags: unittests
env_vars: RUNNER_OS,PYTHON_VERSION
name: codecov-umbrella
fail_ci_if_error: false

+publish-test-results:
+needs: test
+runs-on: ubuntu-latest
+# the build-and-test job might be skipped, we don't need to run this job then
+if: success() || failure()

+steps:
+- name: Download Artifacts
+uses: actions/download-artifact@v2
+with:
+path: test-results

+- name: Publish Unit Test Results
+uses: EnricoMi/publish-unit-test-result-action@v1
+with:
+files: test-results/**/*.xml
2 changes: 1 addition & 1 deletion .github/workflows/parse_logs.py
@@ -18,7 +18,7 @@ def extract_short_test_summary_info(lines):
)
up_to_section_content = itertools.islice(up_to_start_of_section, 1, None)
section_content = itertools.takewhile(
-lambda l: l.startswith("FAILED"), up_to_section_content
+lambda l: l.startswith("FAILED") or l.startswith("ERROR"), up_to_section_content
)
content = "\n".join(section_content)
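For context, a small runnable sketch of what this one-line change does; the sample log lines below are invented for illustration. With the extra startswith("ERROR") clause, itertools.takewhile keeps reading the short-test-summary section past ERROR entries instead of stopping at the first line that does not begin with FAILED.

import itertools

# Hypothetical excerpt of a pytest "short test summary info" section.
sample = [
    "FAILED xarray/tests/test_plot.py::test_facetgrid - AssertionError",
    "ERROR xarray/tests/test_backends.py::test_open_dataset - OSError",
    "FAILED xarray/tests/test_units.py::test_repr - DimensionalityError",
    "=== 2 failed, 1 error in 12.34s ===",  # summary footer ends the section
]

# Old predicate: stops at the first non-FAILED line, dropping the ERROR entry
# and everything after it.
old = list(itertools.takewhile(lambda l: l.startswith("FAILED"), sample))

# New predicate: also accepts ERROR lines, so the whole section is kept.
new = list(
    itertools.takewhile(
        lambda l: l.startswith("FAILED") or l.startswith("ERROR"), sample
    )
)

print(len(old))  # 1
print(len(new))  # 3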
