Conversation
I am super confused... These errors should be ignored, I think: Lines 232 to 233 in 110c857. Also I get other errors when running: `xarray/core/pycompat.py:10: error: Skipping analyzing 'dask.array': found module but no type hints or library stubs`. Somehow it does not recognize the wildcard - don't you get this? Lines 182 to 183 in 110c857. (This is with mypy 0.800 and numpy 1.19.)
edit: not true, see below. On a related note: I think we should add a full mypy run again? pre-commit only checks the files that are changed.
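For reference, a full mypy run (instead of only on changed files) can be requested in the hook configuration itself. A sketch of a `.pre-commit-config.yaml` entry using the pre-commit mirrors-mypy hook; the `args` value is a hypothetical target directory, and `pass_filenames`/`always_run` are the standard pre-commit hook options for overriding the default changed-files behaviour:

```yaml
repos:
  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: v0.800
    hooks:
      - id: mypy
        # run over the whole package instead of only the staged files
        pass_filenames: false
        always_run: true
        args: [xarray]  # hypothetical: point mypy at the package directory
```

With `pass_filenames: false`, pre-commit no longer passes the changed file list to mypy, so every commit triggers a repository-wide check.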
Sorry, I was wrong about (2); that has to do with numpy. edit: see #4878
I do get the errors if I run `pre-commit run --all-files`.
Thanks, maybe we have to wait for mypy 0.810 then (or however they count their versions). I am still a bit confused why this does not fail for numpy 1.20, though... or does this not install numpy etc.?
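The "found module but no type hints or library stubs" error mentioned above is usually silenced with a per-module override in the mypy configuration. A sketch for `setup.cfg` (the `[mypy-…]` section name with a wildcard is standard mypy per-module syntax; this is presumably the pattern the "Lines 182 to 183" reference points at):

```ini
# silence "found module but no type hints or library stubs" for dask
[mypy-dask.*]
ignore_missing_imports = True
```

Note that `[mypy-dask.*]` matches `dask` submodules but not `dask` itself; if the bare `import dask` also errors, a separate `[mypy-dask]` section (or `[mypy-dask,dask.*]`) is needed, which may explain a wildcard seemingly not being recognized.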
Merged upstream/master into …_and_bounds_as_coords (51 commits), including:
- Ensure maximum accuracy when encoding and decoding cftime.datetime values (pydata#4758)
- Fix `bounds_error=True` ignored with 1D interpolation (pydata#4855)
- add a drop_conflicts strategy for merging attrs (pydata#4827)
- update pre-commit hooks (mypy) (pydata#4883)
- ensure warnings cannot become errors in assert_ (pydata#4864)
- update pre-commit hooks (pydata#4874)
- small fixes for the docstrings of swap_dims and integrate (pydata#4867)
- Modify _encode_datetime_with_cftime for compatibility with cftime > 1.4.0 (pydata#4871)
- vélin (pydata#4872)
- don't skip the doctests CI (pydata#4869)
- fix da.pad example for numpy 1.20 (pydata#4865)
- temporarily pin dask (pydata#4873)
- Add units if "unit" is in the attrs. (pydata#4850)
- speed up the repr for big MultiIndex objects (pydata#4846)
- dim -> coord in DataArray.integrate (pydata#3993)
- WIP: backend interface, now it uses subclassing (pydata#4836)
- weighted: small improvements (pydata#4818)
- Update related-projects.rst (pydata#4844)
- iris update doc url (pydata#4845)
- Faster unstacking (pydata#4746)
- ...
Merged upstream/master again (24 commits); new since the previous merge: Compatibility with dask 2021.02.0 (pydata#4884). The remaining commits repeat the list from the earlier merge.
I issued a bugfix release for the issue detected in #4810 (comment), so we should use that.
`pre-commit autoupdate` also tried to update to `mypy=v0.800`, but this fails because `mypy` does not like the redefinition of `dask_array_type` (and the other `*_type` variables in `xarray.core.pycompat`). Does anyone know how to fix that?
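For context, the `*_type` variables in `pycompat` follow a guarded-import pattern roughly like the sketch below (simplified; the names mirror `xarray.core.pycompat`, but the exact code may differ). mypy tends to flag the second assignment as a redefinition unless the variable is declared once with an explicit annotation, so that both branches count as assignments to the same name:

```python
from typing import Tuple, Type

# Declare the variable once with an annotation; the assignments in both
# branches of the try/except are then plain assignments, not redefinitions.
dask_array_type: Tuple[Type, ...]

try:
    import dask.array

    dask_array_type = (dask.array.Array,)
except ImportError:  # dask is an optional dependency
    dask_array_type = ()


def is_duck_dask_array(value) -> bool:
    # illustrative helper: check a value against the guarded type tuple
    # (isinstance with an empty tuple is always False, so this is safe
    # whether or not dask is installed)
    return isinstance(value, dask_array_type)
```

This is only one possible workaround; whether it satisfies the specific mypy 0.800 error would need to be checked against the actual `pycompat` module.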