
Fix Pydap tests for numpy 2.3.0 changes (scalar string to unicode) #10421

Merged · 3 commits · Jun 13, 2025
2 changes: 2 additions & 0 deletions doc/whats-new.rst
@@ -24,6 +24,8 @@ Deprecations

Bug fixes
~~~~~~~~~
- Fix Pydap ``test_cmp_local_file`` for numpy 2.3.0 changes: always select arrays (not scalars) for all numpy versions, and skip the ``astype(str)`` cast of the expected data for numpy >= 2.3.0. (:pull:`10421`)
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.


Documentation
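For context, a minimal standalone sketch (plain numpy plus `packaging`, with made-up byte strings rather than the real test data) of the version-gated decode the entry above describes: on numpy older than 2.3.0 the expected byte strings are still cast to unicode before comparison.

```python
import numpy as np
from packaging.version import Version

# Stand-in byte strings of the kind a netCDF backend hands back.
expected = np.array([b"ind", b"dist"], dtype="S4")

if Version(np.__version__) < Version("2.3.0"):
    # Older numpy: decode to a unicode copy before comparing ("S4" -> "<U4").
    expected = expected.astype(str)

print(np.__version__, expected.dtype)
```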
15 changes: 10 additions & 5 deletions xarray/tests/test_backends.py
@@ -2274,7 +2274,7 @@
# Flaky test. Very open to contributions on fixing this
@pytest.mark.flaky
def test_roundtrip_coordinates(self) -> None:
super().test_roundtrip_coordinates()

(CI annotation on line 2277, GitHub Actions / ubuntu-latest py3.13 flaky: TestNetCDF4ViaDaskData.test_roundtrip_coordinates Failed: Timeout (>180.0s) from pytest-timeout.)

@requires_cftime
def test_roundtrip_cftime_bnds(self):
@@ -5305,7 +5305,7 @@
def test_dask_roundtrip(self) -> None:
with create_tmp_file() as tmp:
data = create_test_data()
data.to_netcdf(tmp)

(CI annotation on line 5308, GitHub Actions / ubuntu-latest py3.13 flaky: TestDask.test_dask_roundtrip Failed: Timeout (>180.0s) from pytest-timeout.)
chunks = {"dim1": 4, "dim2": 4, "dim3": 4, "time": 10}
with open_dataset(tmp, chunks=chunks) as dask_ds:
assert_identical(data, dask_ds)
@@ -5417,11 +5417,12 @@
@contextlib.contextmanager
def create_datasets(self, **kwargs):
with open_example_dataset("bears.nc") as expected:
# print("QQ0:", expected["bears"].load())
pydap_ds = self.convert_to_pydap_dataset(expected)
actual = open_dataset(PydapDataStore(pydap_ds))
# TODO solve this workaround:
# netcdf converts string to byte not unicode
expected["bears"] = expected["bears"].astype(str)
if Version(np.__version__) < Version("2.3.0"):
# netcdf converts string to byte not unicode
expected["bears"] = expected["bears"].astype(str)
yield actual, expected

def test_cmp_local_file(self) -> None:
@@ -5441,7 +5442,9 @@
assert_equal(actual[{"l": 2}], expected[{"l": 2}])

with self.create_datasets() as (actual, expected):
assert_equal(actual.isel(i=0, j=-1), expected.isel(i=0, j=-1))
# always return arrays and not scalars
# scalars will be promoted to unicode for numpy >= 2.3.0
assert_equal(actual.isel(i=[0], j=[-1]), expected.isel(i=[0], j=[-1]))

with self.create_datasets() as (actual, expected):
assert_equal(actual.isel(j=slice(1, 2)), expected.isel(j=slice(1, 2)))
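For reference, a minimal standalone sketch (a hypothetical 2x2 byte-string variable, not the `bears.nc` fixture) of why the selection above now uses lists: integer indexers drop the indexed dimensions and leave a 0-d result, while list indexers keep length-1 dimensions, so both sides of the comparison stay arrays.

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(
    {"bears": (("i", "j"), np.array([[b"ind", b"ist"], [b"wnd", b"dst"]]))}
)

scalar_sel = ds["bears"].isel(i=0, j=-1)     # dims dropped -> 0-d result
array_sel = ds["bears"].isel(i=[0], j=[-1])  # dims kept -> 1 x 1 array

print(scalar_sel.dims, scalar_sel.values.shape)  # () ()
print(array_sel.dims, array_sel.values.shape)    # ('i', 'j') (1, 1)
```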
@@ -5463,7 +5466,9 @@
with create_tmp_file() as tmp_file:
actual.to_netcdf(tmp_file)
with open_dataset(tmp_file) as actual2:
actual2["bears"] = actual2["bears"].astype(str)
if Version(np.__version__) < Version("2.3.0"):
# netcdf converts string to byte not unicode
actual2["bears"] = actual2["bears"].astype(str)
assert_equal(actual2, expected)

@requires_dask