
Commit 3407439

Merge remote-tracking branch 'upstream/main' into apidoc
* upstream/main: (29 commits)
  review actions
  update .git-blame-ignore-revs
  adopt codespell
  Adopt sphinx design (SciTools#5127)
  Bump scitools/workflows from 2023.04.2 to 2023.04.3 (SciTools#5253)
  refresh manual pypi publish instructions (SciTools#5252)
  Updated environment lockfiles (SciTools#5250)
  removed bugfix section
  Make bm_runner location agnostic and include debugging. (SciTools#5247)
  Restore latest Whats New files. SciTools#5220
  typo github.repository_owner. (SciTools#5248)
  Whats new updates for v3.5.0rc0. (SciTools#5246)
  libnetcdf <4.9 pin (SciTools#5242)
  update cf standard units (SciTools#5244)
  Updated environment lockfiles (SciTools#5211)
  update ci locks location (SciTools#5228)
  Fixes to _discontiguity_in_bounds (attempt 2) (SciTools#4975)
  Finalises Lazy Data documentation (SciTools#5137)
  Modernize and simplify iris.analysis._Groupby (SciTools#5015)
  clarity on whatsnew entry contributors (SciTools#5240)
  ...
2 parents 0850fc4 + 2eac400 commit 3407439

File tree

685 files changed: +29459 additions, -25317 deletions


.git-blame-ignore-revs

Lines changed: 20 additions & 0 deletions
@@ -0,0 +1,20 @@
+# Format: numpy array format (#5235)
+c18dcd8dafef0cc7bbbf80dfce66f76a46ce59c5
+
+# style: flake8 (#3755)
+7c86bc0168684345dc475457b1a77dadc77ce9bb
+
+# style: black (#3518)
+ffcfad475e0593e1e40895453cf1df154e5f6f2c
+
+# style: isort (#4174)
+15bbcc5ac3d539cb6e820148b66e7cf55d91c5d2
+
+# style: blacken-docs (#4205)
+1572e180243e492d8ff76fa8cdefb82ef6f90415
+
+# style: sort-all (#4353)
+64705dbc40881233aae45f051d96049150369e53
+
+# style: codespell (#5186)
+417aa6bbd9b10d25cad7def54d47ef4d718bc38d
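A `.git-blame-ignore-revs` file has no effect until git is pointed at it; the standard wiring (git ≥ 2.23) is the `blame.ignoreRevsFile` setting, sketched here in a throwaway repository:

```shell
# Demonstrate wiring up blame.ignoreRevsFile in a scratch repository.
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
# Once set, `git blame` skips any commit whose full hash is listed in
# .git-blame-ignore-revs (one hash per line, '#' comments allowed).
git config blame.ignoreRevsFile .git-blame-ignore-revs
git config blame.ignoreRevsFile   # prints: .git-blame-ignore-revs
```

With the setting in place, the bulk-reformatting commits listed above no longer show up as the "last author" of every touched line.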

.github/workflows/ci-tests.yml

Lines changed: 2 additions & 2 deletions
@@ -63,7 +63,7 @@ jobs:
       CACHE_WEEKS: 2
     run: |
       echo "CACHE_PERIOD=$(date +%Y).$(expr $(date +%U) / ${CACHE_WEEKS})" >> ${GITHUB_ENV}
-      echo "LOCK_FILE=requirements/ci/nox.lock/py$(echo ${{ matrix.python-version }} | tr -d '.')-linux-64.lock" >> ${GITHUB_ENV}
+      echo "LOCK_FILE=requirements/locks/py$(echo ${{ matrix.python-version }} | tr -d '.')-linux-64.lock" >> ${GITHUB_ENV}

     - name: "data cache"
       uses: ./.github/workflows/composite/iris-data-cache
@@ -111,7 +111,7 @@ jobs:
     - name: "nox cache"
       uses: ./.github/workflows/composite/nox-cache
       with:
-        cache_build: 1
+        cache_build: 2
         env_name: ${{ env.ENV_NAME }}
         lock_file: ${{ env.LOCK_FILE }}
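The unchanged `CACHE_PERIOD` context line divides the week-of-year by `CACHE_WEEKS`, so the cache key rolls over every `CACHE_WEEKS` weeks. A sketch with a fixed week number standing in for `date +%U` (the year `2023` is likewise hard-coded purely for illustration):

```shell
CACHE_WEEKS=2
WEEK=17  # stand-in for $(date +%U), the current week of the year
# expr does integer division: weeks 16-17 map to period 8, 18-19 to 9, ...
echo "CACHE_PERIOD=2023.$(expr ${WEEK} / ${CACHE_WEEKS})"   # prints CACHE_PERIOD=2023.8
```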

.github/workflows/ci-wheels.yml

Lines changed: 4 additions & 4 deletions
@@ -74,7 +74,7 @@ jobs:
       CACHE_WEEKS: 2
     run: |
       echo "CACHE_PERIOD=$(date +%Y).$(expr $(date +%U) / ${CACHE_WEEKS})" >> ${GITHUB_ENV}
-      echo "LOCK_FILE=requirements/ci/nox.lock/py$(echo ${{ matrix.python-version }} | tr -d '.')-linux-64.lock" >> ${GITHUB_ENV}
+      echo "LOCK_FILE=requirements/locks/py$(echo ${{ matrix.python-version }} | tr -d '.')-linux-64.lock" >> ${GITHUB_ENV}

     - name: "conda package cache"
       uses: ./.github/workflows/composite/conda-pkg-cache
@@ -103,7 +103,7 @@ jobs:
     - name: "nox cache"
       uses: ./.github/workflows/composite/nox-cache
       with:
-        cache_build: 0
+        cache_build: 1
         env_name: ${{ env.ENV_NAME }}
         lock_file: ${{ env.LOCK_FILE }}

@@ -133,7 +133,7 @@ jobs:
     runs-on: ubuntu-latest
     # upload to Test PyPI for every commit on main branch
     # and check for the SciTools repo
-    if: github.event_name == 'push' && github.event.ref == 'refs/heads/main' && github.repository-owner == 'SciTools'
+    if: github.event_name == 'push' && github.event.ref == 'refs/heads/main' && github.repository_owner == 'SciTools'
     steps:
       - uses: actions/download-artifact@v3
         with:
@@ -153,7 +153,7 @@ jobs:
     name: "publish to pypi"
     runs-on: ubuntu-latest
     # upload to PyPI for every tag starting with 'v'
-    if: github.event_name == 'push' && startsWith(github.event.ref, 'refs/tags/v') && github.repository-owner == 'SciTools'
+    if: github.event_name == 'push' && startsWith(github.event.ref, 'refs/tags/v') && github.repository_owner == 'SciTools'
     steps:
       - uses: actions/download-artifact@v3
         with:

.github/workflows/refresh-lockfiles.yml

Lines changed: 1 addition & 1 deletion
@@ -14,5 +14,5 @@ on:

 jobs:
   refresh_lockfiles:
-    uses: scitools/workflows/.github/workflows/refresh-lockfiles.yml@2023.04.1
+    uses: scitools/workflows/.github/workflows/refresh-lockfiles.yml@2023.04.3
     secrets: inherit

.pre-commit-config.yaml

Lines changed: 7 additions & 0 deletions
@@ -28,6 +28,13 @@ repos:
       # Don't commit to main branch.
       - id: no-commit-to-branch

+  - repo: https://github.com/codespell-project/codespell
+    rev: "v2.2.2"
+    hooks:
+      - id: codespell
+        types_or: [asciidoc, python, markdown, rst]
+        additional_dependencies: [tomli]
+
   - repo: https://github.com/psf/black
     rev: 23.3.0
     hooks:

benchmarks/benchmarks/experimental/ugrid/regions_combine.py

Lines changed: 1 addition & 1 deletion
@@ -50,7 +50,7 @@ def _make_region_cubes(self, full_mesh_cube):
     i_faces = np.concatenate([i_faces[:, 2:], i_faces[:, :2]], axis=1)
     # flatten to get [2 3 4 0 1 (-) 8 9 10 6 7 (-) 13 14 15 11 12 ...]
     i_faces = i_faces.flatten()
-    # reduce back to orignal length, wrap any overflows into valid range
+    # reduce back to original length, wrap any overflows into valid range
     i_faces = i_faces[:n_faces] % n_faces

     # Divide into regions -- always slightly uneven, since 7 doesn't divide
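The line `i_faces = i_faces[:n_faces] % n_faces`, whose comment this commit fixes, truncates the shuffled index array back to its original length and wraps any overflowed indices into the valid range. A self-contained sketch with a made-up `n_faces`:

```python
import numpy as np

n_faces = 5
# A flattened index array that has grown beyond n_faces entries, with some
# values overflowing the valid range [0, n_faces).
i_faces = np.array([7, 8, 9, 5, 6, 2, 3])
# Reduce back to original length, wrap any overflows into valid range.
i_faces = i_faces[:n_faces] % n_faces
print(i_faces.tolist())  # → [2, 3, 4, 0, 1]
```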

benchmarks/benchmarks/sperf/combine_regions.py

Lines changed: 1 addition & 1 deletion
@@ -46,7 +46,7 @@ def _make_region_cubes(self, full_mesh_cube):
     i_faces = np.concatenate([i_faces[:, 2:], i_faces[:, :2]], axis=1)
     # flatten to get [2 3 4 0 1 (-) 8 9 10 6 7 (-) 13 14 15 11 12 ...]
     i_faces = i_faces.flatten()
-    # reduce back to orignal length, wrap any overflows into valid range
+    # reduce back to original length, wrap any overflows into valid range
     i_faces = i_faces[:n_faces] % n_faces

     # Divide into regions -- always slightly uneven, since 7 doesn't divide

benchmarks/bm_runner.py

Lines changed: 31 additions & 19 deletions
@@ -23,14 +23,26 @@
 # for more.
 COMPARE_FACTOR = 1.2

+BENCHMARKS_DIR = Path(__file__).parent

 # Common ASV arguments for all run_types except `custom`.
 ASV_HARNESS = (
-    "asv run {posargs} --attribute rounds=4 --interleave-rounds --strict "
+    "run {posargs} --attribute rounds=4 --interleave-rounds --strict "
     "--show-stderr"
 )


+def _subprocess_run_print(args, **kwargs):
+    print(f"BM_RUNNER DEBUG: {' '.join(args)}")
+    return subprocess.run(args, **kwargs)
+
+
+def _subprocess_run_asv(args, **kwargs):
+    args.insert(0, "asv")
+    kwargs["cwd"] = BENCHMARKS_DIR
+    return _subprocess_run_print(args, **kwargs)
+
+
 def _check_requirements(package: str) -> None:
     try:
         import_module(package)
@@ -47,7 +59,7 @@ def _prep_data_gen_env() -> None:
     Create/access a separate, unchanging environment for generating test data.
     """

-    root_dir = Path(__file__).parents[1]
+    root_dir = BENCHMARKS_DIR.parent
     python_version = "3.10"
     data_gen_var = "DATA_GEN_PYTHON"
     if data_gen_var in environ:
@@ -56,7 +68,7 @@ def _prep_data_gen_env() -> None:
     print("Setting up the data generation environment ...")
     # Get Nox to build an environment for the `tests` session, but don't
     # run the session. Will re-use a cached environment if appropriate.
-    subprocess.run(
+    _subprocess_run_print(
         [
             "nox",
             f"--noxfile={root_dir / 'noxfile.py'}",
@@ -75,15 +87,15 @@ def _prep_data_gen_env() -> None:
     print("Installing Mule into data generation environment ...")
     mule_dir = data_gen_python.parents[1] / "resources" / "mule"
     if not mule_dir.is_dir():
-        subprocess.run(
+        _subprocess_run_print(
             [
                 "git",
                 "clone",
                 "https://github.com/metomi/mule.git",
                 str(mule_dir),
             ]
         )
-    subprocess.run(
+    _subprocess_run_print(
         [
             str(data_gen_python),
             "-m",
@@ -103,28 +115,28 @@ def _setup_common() -> None:
     _prep_data_gen_env()

     print("Setting up ASV ...")
-    subprocess.run(["asv", "machine", "--yes"])
+    _subprocess_run_asv(["machine", "--yes"])

     print("Setup complete.")


 def _asv_compare(*commits: str, overnight_mode: bool = False) -> None:
     """Run through a list of commits comparing each one to the next."""
     commits = [commit[:8] for commit in commits]
-    shifts_dir = Path(".asv") / "performance-shifts"
+    shifts_dir = BENCHMARKS_DIR / ".asv" / "performance-shifts"
     for i in range(len(commits) - 1):
         before = commits[i]
         after = commits[i + 1]
         asv_command = (
-            f"asv compare {before} {after} --factor={COMPARE_FACTOR} --split"
+            f"compare {before} {after} --factor={COMPARE_FACTOR} --split"
         )
-        subprocess.run(asv_command.split(" "))
+        _subprocess_run_asv(asv_command.split(" "))

         if overnight_mode:
             # Record performance shifts.
             # Run the command again but limited to only showing performance
             # shifts.
-            shifts = subprocess.run(
+            shifts = _subprocess_run_asv(
                 [*asv_command.split(" "), "--only-changed"],
                 capture_output=True,
                 text=True,
@@ -207,11 +219,11 @@ def func(args: argparse.Namespace) -> None:

     commit_range = f"{args.first_commit}^^.."
     asv_command = ASV_HARNESS.format(posargs=commit_range)
-    subprocess.run([*asv_command.split(" "), *args.asv_args])
+    _subprocess_run_asv([*asv_command.split(" "), *args.asv_args])

     # git rev-list --first-parent is the command ASV uses.
     git_command = f"git rev-list --first-parent {commit_range}"
-    commit_string = subprocess.run(
+    commit_string = _subprocess_run_print(
         git_command.split(" "), capture_output=True, text=True
     ).stdout
     commit_list = commit_string.rstrip().split("\n")
@@ -246,7 +258,7 @@ def func(args: argparse.Namespace) -> None:
     _setup_common()

     git_command = f"git merge-base HEAD {args.base_branch}"
-    merge_base = subprocess.run(
+    merge_base = _subprocess_run_print(
         git_command.split(" "), capture_output=True, text=True
     ).stdout[:8]

@@ -255,7 +267,7 @@ def func(args: argparse.Namespace) -> None:
         hashfile.flush()
     commit_range = f"HASHFILE:{hashfile.name}"
     asv_command = ASV_HARNESS.format(posargs=commit_range)
-    subprocess.run([*asv_command.split(" "), *args.asv_args])
+    _subprocess_run_asv([*asv_command.split(" "), *args.asv_args])

     _asv_compare(merge_base, "HEAD")
@@ -312,13 +324,13 @@ def csperf(
     asv_command = asv_command.replace(" --strict", "")
     # Only do a single round.
     asv_command = re.sub(r"rounds=\d", "rounds=1", asv_command)
-    subprocess.run([*asv_command.split(" "), *args.asv_args])
+    _subprocess_run_asv([*asv_command.split(" "), *args.asv_args])

-    asv_command = f"asv publish {commit_range} --html-dir={publish_subdir}"
-    subprocess.run(asv_command.split(" "))
+    asv_command = f"publish {commit_range} --html-dir={publish_subdir}"
+    _subprocess_run_asv(asv_command.split(" "))

     # Print completion message.
-    location = Path().cwd() / ".asv"
+    location = BENCHMARKS_DIR / ".asv"
     print(
         f'New ASV results for "{run_type}".\n'
         f'See "{publish_subdir}",'
@@ -366,7 +378,7 @@ def add_arguments(self) -> None:
     @staticmethod
     def func(args: argparse.Namespace) -> None:
         _setup_common()
-        subprocess.run(["asv", args.asv_sub_command, *args.asv_args])
+        _subprocess_run_asv([args.asv_sub_command, *args.asv_args])


 def main():
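The two helpers added to `bm_runner.py` wrap every subprocess call so the exact command line is echoed before it runs, which is what makes the runner's debugging output possible. The print wrapper can be recreated and exercised standalone (without the ASV-specific `cwd` pinning):

```python
import subprocess
import sys


def _subprocess_run_print(args, **kwargs):
    # Echo the exact command line before running it, mirroring bm_runner.py.
    print(f"BM_RUNNER DEBUG: {' '.join(args)}")
    return subprocess.run(args, **kwargs)


# Exercise the wrapper with a trivial command; sys.executable stands in for
# asv here purely to keep the demonstration self-contained.
result = _subprocess_run_print(
    [sys.executable, "-c", "print('hello')"], capture_output=True, text=True
)
print(result.stdout.strip())
```

Because `_subprocess_run_asv` routes through this wrapper and pins `cwd=BENCHMARKS_DIR`, the runner no longer depends on being invoked from the `benchmarks/` directory, which is the "location agnostic" part of the change.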

codecov.yml

Lines changed: 9 additions & 0 deletions
@@ -0,0 +1,9 @@
+coverage:
+  # see https://docs.codecov.com/docs/commit-status
+  status:
+    project:
+      default:
+        target: auto
+        # coverage can drop by up to <threshold>% while still posting success
+        threshold: 3%
+    patch: off

docs/gallery_code/general/plot_projections_and_annotations.py

Lines changed: 1 addition & 1 deletion
@@ -78,7 +78,7 @@ def make_plot(projection_name, projection_crs):
     y_points = y_lower + y_delta * np.concatenate(
         (zeros, steps, ones, steps[::-1])
     )
-    # Get the Iris coordinate sytem of the X coordinate (Y should be the same).
+    # Get the Iris coordinate system of the X coordinate (Y should be the same).
     cs_data1 = x_coord.coord_system
     # Construct an equivalent Cartopy coordinate reference system ("crs").
     crs_data1 = cs_data1.as_cartopy_crs()
