
pysteps.io with xarray #219

Merged · 17 commits into pysteps-v2 on Aug 4, 2021
Conversation

@dnerini (Member) commented on Jul 26, 2021

Include the new xarray-based data model in the pysteps.io module (see #12).

The imported data are converted into an xarray Dataset by means of a decorator.

See below for an example using MeteoSwiss data.

precip_ds = pysteps.io.import_mch_gif(filename, "AQC", 5.0)
precip_ds.info()

xarray.Dataset {
dimensions:
        y = 640 ;
        x = 710 ;

variables:
        float64 precipitation(y, x) ;
                precipitation:standard_name = precipitation_rate ;
                precipitation:long_name = Precipitation product ;
                precipitation:product = AQC ;
                precipitation:unit = mm ;
                precipitation:accutime = 5.0 ;
                precipitation:transform = None ;
                precipitation:zerovalue = 0.0 ;
                precipitation:threshold = 0.0009628129986471908 ;
                precipitation:zr_a = 316.0 ;
                precipitation:zr_b = 1.5 ;
        float64 x(x) ;
                x:standard_name = projection_x_coordinate ;
                x:units = m ;
        float64 y(y) ;
                y:standard_name = projection_y_coordinate ;
                y:units = m ;

// global attributes:
        :institution = MeteoSwiss ;
        :projection = +proj=somerc  +lon_0=7.43958333333333 +lat_0=46.9524055555556 +k_0=1 +x_0=600000 +y_0=200000 +ellps=bessel +towgs84=674.374,15.056,405.346,0,0,0,0 +units=m +no_defs ;
}

The same for a BOM file:

xarray.Dataset {
dimensions:
        y = 512 ;
        x = 512 ;

variables:
        float64 precipitation(y, x) ;
                precipitation:standard_name = precipitation_rate ;
                precipitation:long_name = Precipitation product ;
                precipitation:product = None ;
                precipitation:unit = mm ;
                precipitation:accutime = 6 ;
                precipitation:transform = None ;
                precipitation:zerovalue = 0.0 ;
                precipitation:threshold = 0.05 ;
                precipitation:zr_a = None ;
                precipitation:zr_b = None ;
        float64 x(x) ;
                x:standard_name = projection_x_coordinate ;
                x:units = m ;
        float64 y(y) ;
                y:standard_name = projection_y_coordinate ;
                y:units = m ;

// global attributes:
        :institution = Commonwealth of Australia, Bureau of Meteorology ;
        :projection = +proj=aea  +lon_0=144.752 +lat_0=-37.852 +lat_1=-18.000 +lat_2=-36.000 ;
}

edit: limit the scope of this PR to the io module only.

@dnerini dnerini self-assigned this Jul 26, 2021
@dnerini dnerini changed the base branch from master to pysteps-v2 July 26, 2021 11:25
@dnerini dnerini changed the title Implement xarray data model pysteps.io with xarray Jul 26, 2021
@dnerini (Member, Author) commented on Jul 26, 2021

Spontaneous questions:

  • should we use "precipitation" to name the main variable? Or can we make it more general, to make clear that other variables (e.g. cloud cover) can be used?
  • we should revise the metadata to better align with the CF conventions and with other libraries used for radar data processing (e.g. wradlib):
    • "units" instead of "unit"
    • remove any metadata that can be easily derived from the data (e.g. threshold, zerovalue)
    • ...
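The proposed CF alignment could look roughly like this: a minimal sketch operating on a plain metadata dictionary as returned by the v1 importers (the helper name `cf_align` is hypothetical, not part of pysteps):

```python
# Hypothetical sketch (not the actual pysteps API): align an importer's
# metadata dictionary with the CF conventions by renaming "unit" -> "units"
# and dropping attributes that can be derived from the data itself.

DERIVABLE = ("threshold", "zerovalue")  # e.g. zerovalue = field minimum

def cf_align(metadata: dict) -> dict:
    out = dict(metadata)  # work on a copy, leave the input untouched
    if "unit" in out:
        out["units"] = out.pop("unit")  # CF spells it "units"
    for key in DERIVABLE:
        out.pop(key, None)
    return out

meta = {"unit": "mm", "accutime": 5.0, "threshold": 0.05, "zerovalue": 0.0}
print(cf_align(meta))  # {'accutime': 5.0, 'units': 'mm'}
```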

@dnerini dnerini marked this pull request as ready for review July 26, 2021 13:54
Commits: Fix bounding box coordinates · Add missing metadata (ignore quality field)
@cvelascof (Member) commented:
Spontaneous responses:

* should we use "precipitation" to name the main variable? Can we make it more general so that to make it clear that other variables can be used (e.g. cloud cover)?

At this point I believe that "precipitation" is a good choice, but I agree that we may have other cases, such as "rain_rate".

* we should revise the metadata to better align them to CF standards and other libraries used for radar data processing (e.g. wradlib).

Yes, it is better to follow the CF standards, as xarray uses them as well.

But I would prefer to keep key metadata in the structure as attributes of the variable, because metadata that initially could easily be derived from the data may be lost after a number of transformations, or may not be that easy to derive. For example, if a radar field contains only non-zero values, how can the zero_value be derived?

@dnerini (Member, Author) commented on Jul 28, 2021

OK, for the moment I decided to return a DataArray instead of a Dataset (hence ignoring the quality field, which we never used anyway). This way, we remain agnostic about the variable we are reading (it could be anything).

 <xarray.DataArray (y: 640, x: 710)>
array([[nan, nan, nan, ..., nan, nan, nan],
       [nan, nan, nan, ..., nan, nan, nan],
       [nan, nan, nan, ..., nan, nan, nan],
       ...,
       [nan, nan, nan, ..., nan, nan, nan],
       [nan, nan, nan, ..., nan, nan, nan],
       [nan, nan, nan, ..., nan, nan, nan]])
Coordinates:
  * x        (x) float64 2.555e+05 2.565e+05 2.575e+05 ... 9.635e+05 9.645e+05
  * y        (y) float64 -1.595e+05 -1.585e+05 ... 4.785e+05 4.795e+05
Attributes:
    unit:         mm
    accutime:     5.0
    transform:    None
    zerovalue:    0.0
    threshold:    0.0009628129986471908
    zr_a:         316.0
    zr_b:         1.5
    institution:  MeteoSwiss
    projection:   +proj=somerc  +lon_0=7.43958333333333 +lat_0=46.95240555555...

I was thinking that it would make more sense if the DataArray included a singleton "t" dimension holding the timestamp of the radar image. What do you think?

@cvelascof (Member) commented:
My personal preference is to use a Dataset for the source input, as it stores multiple variables together, like the 'precipitation' and 'projection' information. The main variable of interest can always be selected from the Dataset before passing it into pySTEPS routines.

Also, "time" should be a coordinate, as that is what really differentiates one radar field from another (assuming they all share common metadata across time steps). Using 'time' as a coordinate also makes it possible to open and concatenate multiple radar files with xarray.open_mfdataset.
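The "time as a coordinate" idea can be sketched as follows (variable and helper names are illustrative, not the pysteps API): each imported field gets a singleton time dimension, so consecutive radar fields concatenate cleanly, mirroring what xarray.open_mfdataset does when opening multiple netCDF files.

```python
import numpy as np
import xarray as xr

def with_time(field, timestamp):
    # Wrap a 2D radar field and prepend a singleton "time" dimension.
    da = xr.DataArray(field, dims=("y", "x"), name="precipitation")
    return da.expand_dims(time=[np.datetime64(timestamp)])

frames = [with_time(np.zeros((4, 4)), t)
          for t in ("2021-07-26T10:00", "2021-07-26T10:05")]
series = xr.concat(frames, dim="time")  # dims: time=2, y=4, x=4
```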

@pulkkins pulkkins self-requested a review July 28, 2021 07:51
@dnerini (Member, Author) commented on Jul 29, 2021

I see your point concerning the dataset, @cvelascof, and this could easily be implemented for the importers. I just question the need for datasets when we mostly work with single variables (radar precip). One can always build a new dataset when needed (when adding NWP data, for example).

And do we need an extra variable for the projection? I've seen it done before for netCDFs, but we could try to simplify this by adopting EPSG codes, at which point it would be enough to add a crs=EPSG:xxxx attribute. What do you think?
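A sketch of what that simplification could look like (the attribute name "crs" is illustrative, and EPSG:21781 is assumed here to be the matching code for the Swiss CH1903/LV03 grid):

```python
import numpy as np
import xarray as xr

# Instead of carrying the full PROJ string, attach a single CRS attribute
# that downstream code can expand back into a full definition when needed.
da = xr.DataArray(np.zeros((3, 3)), dims=("y", "x"))
da.attrs["crs"] = "EPSG:21781"  # assumed EPSG code for the CH1903/LV03 grid

# With pyproj installed, the full projection could be recovered on demand:
# from pyproj import CRS; proj_string = CRS.from_user_input(da.attrs["crs"]).to_proj4()
```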

Also, most of our methods work on single arrays, so the migration to xarray would be easier with DataArrays. And we shouldn't make any assumptions about the names of the variables.

One possible solution could be to return a dataset when importing but then the user would have to pass the variable of interest to a given method, for example:

ds = pysteps.io.read_timeseries() # or xr.open_mfdataset()
precip = ds.precip.pysteps.to_rainrate()
motion = pysteps.motion.dense_lucaskanade(precip)
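The `ds.precip.pysteps.to_rainrate()` pattern above relies on xarray's accessor mechanism. A minimal sketch of how such an accessor could be registered (the accessor class and the trivial dB back-transform are illustrative, not the actual pysteps implementation):

```python
import numpy as np
import xarray as xr

@xr.register_dataarray_accessor("pysteps")
class PystepsAccessor:
    """Hypothetical accessor exposing pysteps-style conversions."""

    def __init__(self, da):
        self._da = da

    def to_rainrate(self):
        # Illustrative only: invert a dB transform if one was applied,
        # otherwise assume the data are already a rain rate.
        if self._da.attrs.get("transform") == "dB":
            out = 10.0 ** (self._da / 10.0)
            out.attrs = {**self._da.attrs, "transform": None}
            return out
        return self._da

precip = xr.DataArray([[10.0, 20.0]], dims=("y", "x"),
                      attrs={"transform": "dB"})
rainrate = precip.pysteps.to_rainrate()  # 10 dB -> 10, 20 dB -> 100
```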

@RubenImhoff (Contributor) commented:
That could work, @dnerini (thanks for all the work, by the way)! We only should not forget to change that in the examples in that case (to avoid confusion).
@cvelascof, does projection need to be a variable? I.e., can't it just be an attribute as in @dnerini's example?

I find it hard to judge whether we need to move to DataArrays or Datasets. As @dnerini pointed out, we generally work with single arrays. What I can imagine, with our move towards blending and perhaps later machine-learning techniques, is that at some point we will want to involve more variables than just precipitation. If the information originates from one dataset, e.g. the NWP model, it may be a bit redundant to import it into multiple DataArrays that contain similar grid, time and metadata information. However, either way will work, and I think we just have to decide on what we see as the clearer and cleaner method. I personally have no preference.

@RubenImhoff (Contributor) left a comment:
Great work, @dnerini! I left some comments and questions, but other than that (and once we've made up our minds about the use of DataArrays or Datasets) it's good to go.

xsize, ysize = metadata["xpixelsize"], metadata["ypixelsize"]
# x_coords = np.arange(x1, x2, xsize) + xsize / 2
# y_coords = np.arange(y1, y2, ysize) + ysize / 2
x_coords = np.arange(x1, x1 + xsize * array.shape[1], xsize) + xsize / 2
@RubenImhoff (Contributor) commented:
This gives the same result as the commented line in line 293, right? Well, we could remove one of the options in that case. :)

In addition, just to check: is this line meant to point to the cell centers? (In that case everything is fine!)
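For what it's worth, the equivalence of the two formulations can be checked quickly with made-up numbers, assuming x1 and x2 describe the outer corners of the grid:

```python
import numpy as np

# Both variants should yield the same cell-center coordinates when
# x2 - x1 is exactly ncols * xsize (i.e. x1, x2 are the grid corners).
x1, xsize, ncols = 255000.0, 1000.0, 710
x2 = x1 + xsize * ncols

centers_a = np.arange(x1, x2, xsize) + xsize / 2                  # commented-out variant
centers_b = np.arange(x1, x1 + xsize * ncols, xsize) + xsize / 2  # current variant

assert np.array_equal(centers_a, centers_b)
assert centers_a.size == ncols and centers_a[0] == x1 + xsize / 2
```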

@cvelascof (Member) commented on Jul 30, 2021:
Perhaps we should stay with DataArray for the moment for the core routines. That said, xarray opens netCDF files as Datasets, so eventually the user may need to select the variable from a Dataset to run our core routines.

@cvelascof (Member) commented:
About 'projection': the CF conventions state that "A grid mapping variable may be referenced by a data variable in order to explicitly declare the coordinate reference system (CRS) used for the horizontal spatial coordinate values" (https://cfconventions.org/Data/cf-conventions/cf-conventions-1.8/cf-conventions.html#grid-mappings-and-projections). So if the outputs are to be netCDF, it would be useful to have Datasets with both the variable and the 'grid mapping variable' (aka the projection).
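A minimal sketch of such a CF grid mapping variable (attribute values are illustrative, loosely based on the BOM example above): a dimensionless "crs" variable carries the projection parameters, and the data variable declares it via its "grid_mapping" attribute.

```python
import numpy as np
import xarray as xr

# CF-style layout: the data variable references the grid mapping variable
# by name through its "grid_mapping" attribute.
ds = xr.Dataset(
    {
        "precipitation": (
            ("y", "x"),
            np.zeros((2, 2)),
            {"units": "mm", "grid_mapping": "crs"},
        ),
        "crs": (
            (),
            0,
            {
                "grid_mapping_name": "albers_conical_equal_area",
                "longitude_of_central_meridian": 144.752,
            },
        ),
    }
)
```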

@dnerini (Member, Author) commented:

> This gives the same result as the commented line in line 293, right? Well, we could remove one of the options in that case. :)

It should, but at the moment the commented lines do not work because some importers wrongly report the x1, x2, y1, y2 coordinates as the coordinates of the corner pixels instead of the corners themselves... but we should use the commented code (it's safer), which is why I would prefer to leave it there for the time being.

> In addition, just to check, this line is meant to point to the cell centers? (in that case everything is fine!)

Exactly!

@dnerini (Member, Author) commented:

> perhaps we should stay with DataArray for the moment for the core routines. Said that however, xarray opens netCDF as Datasets so eventually user may need to select the variable from a dataset to run our core routines.

OK, let's stick to DataArrays for now; we can easily switch to Datasets at any point before releasing V2.

@dnerini (Member, Author) commented:

> About 'projection', the CF conventions state that "A grid mapping variable may be referenced by a data variable in order to explicitly declare the coordinate reference system (CRS) used for the horizontal spatial coordinate values" (https://cfconventions.org/Data/cf-conventions/cf-conventions-1.8/cf-conventions.html#grid-mappings-and-projections), so if outputs are to be netCDF, it would be useful to have Datasets with both the variable and the 'grid mapping variable' (aka the projection).

Certainly very important for the netCDF exporters, but perhaps less so internally to the code itself?

Review threads (resolved): pysteps/io/readers.py, pysteps/tests/helpers.py
__, __, metadata = pysteps.io.import_knmi_hdf5(filename)

smart_assert(metadata[variable], expected, tolerance)
smart_assert(data_array.attrs[variable], expected, tolerance)
@RubenImhoff (Contributor) commented:

Now that x, y, pixelsize, etc. are part of the DataArray and no longer attributes or metadata (and thus not tested as such), should the KNMI importer also have a 'geodata' that is tested here, similar to the other importers?

@dnerini (Member, Author) commented:

See #218: the idea for V2 is to remove the national importers from the core library (pysteps) and refactor them as optional plugins. In this sense, the KNMI importer will become an external package (e.g. "pysteps-knmi-importer") that will be maintained (and tested) externally (by you, I suppose ;-) ).

Review thread (resolved): pysteps/tests/test_io_mrms_grib.py
@dnerini (Member, Author) commented on Aug 1, 2021

This PR is by no means definitive, but should instead be seen as a first step towards integrating the xarray data model into pysteps. I suggest we merge it fairly soon and advance in the other modules, knowing that we will in any case have to come back to the io module for more adjustments before V2 is ready.

A "legacy" option is added to revert the importers' and readers' behavior to version 1. This is a temporary solution to allow the examples, and other functions, to run as usual (v1.*).

Hopefully, this will allow an easier transition to version 2 during the development process and will allow testing functions that were not updated to v2.
Many of the tests were updated to use the legacy data structures (v1). The tests that still contain issues were tagged with a TODO message and are skipped.

This will allow incremental changes to be tested in the new v2.

IMPORTANT: once the v2 branch is stable, we may remove the legacy compatibility from the code base and the tests.
@aperezhortal (Member) commented:
Great work @dnerini! I agree that we should merge it soon and advance in other modules.

I pushed two commits that add a temporary "legacy" option to allow the importers and readers to behave as in version 1. Also, the tests that are still written for v1 were updated to use this new legacy option. The tests that couldn't be easily fixed were manually skipped and tagged with a TODO message. This legacy option will allow maintaining the testing functionality through the migration to xarray.

@dnerini (Member, Author) commented on Aug 4, 2021

Ah very nice @aperezhortal , thanks! Looks like tests are now failing because of missing dependencies. I'll cherry-pick commit b4c165a to fix it as soon as possible (should have some time this afternoon).

codecov bot commented on Aug 4, 2021

Codecov Report

Merging #219 (15d5b5b) into pysteps-v2 (c8ebc2d) will decrease coverage by 8.33%.
The diff coverage is 96.55%.


@@              Coverage Diff               @@
##           pysteps-v2     #219      +/-   ##
==============================================
- Coverage       79.84%   71.51%   -8.34%     
==============================================
  Files             137      139       +2     
  Lines            9929     9988      +59     
==============================================
- Hits             7928     7143     -785     
- Misses           2001     2845     +844     
Flag Coverage Δ
unit_tests 71.51% <96.55%> (-8.34%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
pysteps/io/readers.py 81.25% <72.72%> (+7.05%) ⬆️
pysteps/tests/helpers.py 87.50% <87.50%> (-6.62%) ⬇️
pysteps/decorators.py 99.28% <100.00%> (+0.14%) ⬆️
pysteps/io/importers.py 71.77% <100.00%> (+0.12%) ⬆️
pysteps/tests/test_cascade.py 47.22% <100.00%> (-52.78%) ⬇️
pysteps/tests/test_datasets.py 45.45% <100.00%> (-26.64%) ⬇️
pysteps/tests/test_detcatscores.py 94.73% <100.00%> (-5.27%) ⬇️
pysteps/tests/test_exporters.py 31.91% <100.00%> (-68.09%) ⬇️
pysteps/tests/test_io_archive.py 100.00% <100.00%> (ø)
pysteps/tests/test_io_bom_rf3.py 100.00% <100.00%> (ø)
... and 35 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update c8ebc2d...15d5b5b.

@dnerini dnerini merged commit 0082738 into pysteps-v2 Aug 4, 2021
@dnerini dnerini deleted the xarray-data-model branch August 4, 2021 12:27
RubenImhoff added a commit to RubenImhoff/pysteps that referenced this pull request Aug 23, 2021
* First basic functions to implement STEPS blending

* Add compute of blend means,sigmas and recompose

* pysteps.io with xarray (pySTEPS#219)

* Add xarray dependency

* MCH importer returns an xarray Dataset

* Remove plot lines

* Remove import

* Adapt readers to xarray format

* Rewrite as more general decorator

* Add missing metadata

* Adapt io tests

* Mrms bounding box (pySTEPS#222)

* Fix bounding box coordinates

* Add missing metadata

* Import xarray DataArray

Ignore quality field

* Black

* Do not hardcode metadata

* Address review comments by ruben

* Add a legacy option to the io functions

A "legacy" options is added to revert back the importers and readers behavior to version 1. This is a temporary solution to allow the examples, and other functions, to run as usual (v1.*).

Hopefully, this is will allow a easier transition into version 2 during the development process and will allow testing functions that were not updated to v2.

* Fix compatibility problems with tests

Many of the tests were updated to use the legacy data structures (v1). The tests that still contains issues, were tagged with a TODO message and they are skipped.

This will allow incremental changes to be tested in the new v2.

IMPORTANT: once the v2 branch is stable, we may remove the legacy compatibility from the code base and the tests.

* Update dependencies

* Ignore plugins test

Co-authored-by: Andres Perez Hortal <16256571+aperezhortal@users.noreply.github.com>

* Add blend_optical_flow

* changes to steps blending procedure - weights according to adjusted BPS2006 method

* changes to blending procedures - adjust weights from original BPS2006 method

* Determine spatial correlation of NWP model forecast

* First attempt to make correlations and thus weights lead time dependent (in progress..)

* Change back to original BPS2006 blending formulation and add regression of skill values to climatological values for weights determination

* Reformat code with Black

* Skill score script imports climatological correlation-values from file now

* Small changes to skill score script

* Add skill score tests and an interface

* Add skill score tests and an interface

* Small change to docstring

* Bom import xarray (pySTEPS#228)

* Add import_bom_rf3  using xarray

* Add tests to xarray version

* Fix mrms importer tests

* Pass **kwargs to internal functions

* Add nwp_importers to read bom nwp sample data

* Add bom nwp data to source file

* Add tests for bom_nwp reader

* Fix pystepsrc

Co-authored-by: Andres Perez Hortal <16256571+aperezhortal@users.noreply.github.com>

* Functions to store and compute climatological weights (pySTEPS#231)

* Implement the functions get_default_weights, save_weights, calc_clim_weights. 

These functions are used to evolve the weights in the scale- and skill-dependent blending with NWP in the STEPS blending algorithm. The current weights, based on the correlations per cascade level, are regressed towards these climatological weights in the course of the forecast.

These functions save the current and compute the climatological weights (a running mean of the weights of the past n days, where typically n=30). First daily averages are stored and these are then averaged over the running window of n days.

* Add tests for pysteps climatological weight io and calculations.

* Add path_workdir to outputs section in pystepsrc file and use it as a default path to store/retrieve blending weights.

* Minor changes to docstrings, changes to skill scores and testing scripts

* Completed documentation for blending clim module, cleanup.

Co-authored-by: RubenImhoff <r.o.imhoff@live.nl>

Co-authored-by: Carlos Velasco <carlos.velasco@bom.gov.au>
Co-authored-by: ned <daniele.nerini@meteoswiss.ch>
Co-authored-by: Andres Perez Hortal <16256571+aperezhortal@users.noreply.github.com>
Co-authored-by: Ruben Imhoff <Ruben.Imhoff@deltares.nl>
Co-authored-by: Carlos Velasco <cvelascof@gmail.com>
Co-authored-by: Lesley De Cruz <lesley.decruz+git@gmail.com>
dnerini added a commit that referenced this pull request Jan 14, 2022

* Main blending module, first steps

* Add simple tests

* Minor changes to tester: velocity now based on rainfall field of NWP

* Add utilities to decompose, store and load NWP cascades for use in blending  (#232)

* First version of NWP decomposition

* Added saving to netCDF

* Completed functions for saving and loading decomposed NWP data

* Added example files for the decomposed NWP functions

* Added compatibility with numpy datetime

* Use default output path_workdir for tmp files in blending/utils.py.

* Update documentation of NWP decomposition functions in utils.py

Co-authored-by: Wout Dewettinck <wout.dewettinck@ugent.be>
Co-authored-by: wdewettin <87696913+wdewettin@users.noreply.github.com>

* Add importer for RMI NWP data (#234)

Add importer for netcdf NWP data from RMI using xarrays.

* Add test for RMI NWP data importer.

* Add entry for RMI NWP data in pystepsrc.

* Run black on everything: fix formatting.

* Add KNMI Harmonie NWP netcdf importer and tests (#235)

* Changes to v_models to make use of multiple timesteps. Changes in the velocity field over time in the NWP forecast will be taken into account now.

* Fixes for KNMI importer:

Add forgotten @postprocess_import()
Don't call dropna on NWP data.

* Avoid shadowing of pysteps.blending.utils by pysteps.utils

* First attempt for probability matching and masking utility; part 1

* Changes to prob matching and masking methods; part 2

* Prob matching and masking changes; part 3. Ready for testing with real data from here on

* Remove unnecessary print statements

* Cleanup imports

* More cleanup

* Update docstrings

* RMI importer for gallery example (will follow)

* Reprojection functionality (#236)

* Added Lesley's reprojection module to this branch

* Added compatibility for three-dimensional xarrays

* Add commentary to reprojection util

* Changes to make reprojection of KNMI data possible

* Changes after Daniele's review

* Add dependencies

* Changes to importers, see issue #215

* Add tests

* Fix some issues

* documentation

* Fixes for tests

* Set requirements again

* Some fixes

* Changes to nwp_importers after Carlos' response

* Remove wrong example script

* Remove rasterio dependencies from lists

* First try to prevent testing error

* Changes Daniele and fix knmi nwp importer

* Add rasterio to tox.ini

* Aesthetics

* rasterio import test

* Add rasterio to the test dependencies

* Reset try-except functionality for rasterio import

* Fix for failing test on windows python 3.6

* add importerskip rasterio

Co-authored-by: Wout Dewettinck <wout.dewettinck@ugent.be>

* Fixes in nwp importers

* Revert "Merge branch 'steps_blending' into pysteps-v2" (#239)

This reverts commit 2c639f8, reversing
changes made to bccb8fc.

* Merge latest version pysteps-v2 into steps_blending branch (#237)

* Update docstrings

* More cleanup

* Cleanup imports

* Cleanup imports

* More cleanup

* Update docstrings

* Update references

Mention the work of Ravuri et al (2021, Nature) as an example of work using cGANs to generate ensembles

* Clean up page

* Reprojection functionality (#236)

* Added Lesley's reprojection module to this branch

* Added compatibility for three-dimensional xarrays

* Add commentary to reprojection util

* Changes to make reprojection of KNMI data possible

* Changes after Daniele's review

* Add dependencies

* Changes to importers, see issue #215

* Add tests

* Fix some issues

* documentation

* Fixes for tests

* Set requirements again

* Some fixes

* Changes to nwp_importers after Carlos' response

* Remove wrong example script

* Remove rasterio dependencies from lists

* First try to prevent testing error

* Changes Daniele and fix knmi nwp importer

* Add rasterio to tox.ini

* Aesthetics

* rasterio import test

* Add rasterio to the test dependencies

* Reset try-except functionality for rasterio import

* Fix for failing test on windows python 3.6

* add importerskip rasterio

Co-authored-by: Wout Dewettinck <wout.dewettinck@ugent.be>

* Revert "Merge branch 'steps_blending' into pysteps-v2" (#239)

This reverts commit 2c639f8, reversing
changes made to bccb8fc.

Co-authored-by: ned <daniele.nerini@meteoswiss.ch>
Co-authored-by: dnerini <daniele.nerini@gmail.com>
Co-authored-by: Wout Dewettinck <wout.dewettinck@ugent.be>

* NWP skill calculation only within radar domain

* Update docs

* Add example for gallery examples

* Fix docstrings example

* Remove additional normalization step

* Fixes for the tests

* update docs

* changes to post-processing rainfall field and docstrings

* Update contributing guidelines (#241)

- Improve grammar.
- Make the guide more concise. Remove unused/unnecessary rules.
- Indicate more clearly which parts of the guidelines are inspired by other projects (before they were only mentioned at the end).
- Change "Travis-CI" references by "GitHub Actions".

* Advect noise cascade

* Allow for moving domain mask of extrapolation component

* minor fixes

* Linear blending (#229)

* Implemented linear blending function
* Added example file and test
* Added compatibility for NWP ensembles

The PR is ready to go. Making the code xarray ready will be done in a separate PR. 

Co-authored-by: RubenImhoff <r.o.imhoff@live.nl>

* weights calculation adjustment outside radar domain if only one model present

* allow for mirroring of advected noise cascade

* implementation of weights following Seed et al. (2013)

* Allow for decomposed NWP precip and NWP velocity fields: part 2

* Store decomposed fields with compression

* changes after first review Daniele

* Remove unnecessary print statement

* fixes to blending utils and implementation of blending utils tests

* remove unnecessary lines

* Fix one time step shift of extrapolation skill prior to blending

* minor changes to blending climatology, blending weights and remove path_workdir from pystepsrc

* Make NWP forecast decomposition prior to blending function optional

* Use pathlib

* Extract methods

* Minor changes to docstrings

* Access climatological skill file for multiple NWP model and date string changes to prevent errors in blending.utils

Co-authored-by: Carlos Velasco <carlos.velasco@bom.gov.au>
Co-authored-by: ned <daniele.nerini@meteoswiss.ch>
Co-authored-by: Andres Perez Hortal <16256571+aperezhortal@users.noreply.github.com>
Co-authored-by: Ruben Imhoff <Ruben.Imhoff@deltares.nl>
Co-authored-by: Carlos Velasco <cvelascof@gmail.com>
Co-authored-by: Lesley De Cruz <lesley.decruz+git@gmail.com>
Co-authored-by: Wout Dewettinck <wout.dewettinck@ugent.be>
Co-authored-by: wdewettin <87696913+wdewettin@users.noreply.github.com>
Co-authored-by: Lesley De Cruz <lesley.decruz@meteo.be>
Co-authored-by: dnerini <daniele.nerini@gmail.com>
dnerini added a commit that referenced this pull request Jan 14, 2022
* First basic functions to implement STEPS blending

* Add compute of blend means,sigmas and recompose

* pysteps.io with xarray (#219)

* Add xarray dependency

* MCH importer returns an xarray Dataset

* Remove plot lines

* Remove import

* Adapt readers to xarray format

* Rewrite as more general decorator

* Add missing metadata

* Adapt io tests

* Mrms bounding box (#222)

* Fix bounding box coordinates

* Add missing metadata

* Import xarray DataArray

Ignore quality field

* Black

* Do not hardcode metadata

* Address review comments by ruben

* Add a legacy option to the io functions

A "legacy" option is added to revert the importers and readers to their version 1 behavior. This is a temporary solution that allows the examples, and other functions, to run as usual (v1.*).

Hopefully, this will allow an easier transition into version 2 during the development process and will allow testing of functions that have not yet been updated to v2.
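
The legacy switch described above could be wired into the importer decorator roughly as follows. This is a minimal, dependency-free sketch: `postprocess_import` exists in pysteps but its real signature and internals differ, and `import_dummy` is a hypothetical stand-in importer.

```python
import functools

import numpy as np


def postprocess_import(legacy_default=False):
    """Sketch of a decorator converting an importer's output to the new
    data model, with a ``legacy`` escape hatch for v1 behavior.
    Illustrative only; the actual pysteps decorator differs."""

    def decorator(importer):
        @functools.wraps(importer)
        def wrapper(*args, legacy=legacy_default, **kwargs):
            array, metadata = importer(*args, **kwargs)
            if legacy:
                # v1 behavior: return the raw array and metadata dict
                return array, metadata
            # v2 behavior: wrap into a dataset-like structure
            # (a plain dict here instead of an xarray Dataset)
            return {"data": array, "attrs": metadata}

        return wrapper

    return decorator


@postprocess_import()
def import_dummy(filename):
    # Hypothetical importer returning (array, metadata) as in v1.
    return np.zeros((4, 4)), {"unit": "mm"}
```

Callers that have not migrated yet would simply pass `legacy=True` to keep receiving the `(array, metadata)` pair.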

* Fix compatibility problems with tests

Many of the tests were updated to use the legacy data structures (v1). The tests that still contain issues were tagged with a TODO message and are skipped.

This will allow incremental changes to be tested in the new v2.

IMPORTANT: once the v2 branch is stable, we may remove the legacy compatibility from the code base and the tests.

* Update dependencies

* Ignore plugins test

Co-authored-by: Andres Perez Hortal <16256571+aperezhortal@users.noreply.github.com>

* Add blend_optical_flow

* changes to steps blending procedure - weights according to adjusted BPS2006 method

* changes to blending procedures - adjust weights from original BPS2006 method

* Determine spatial correlation of NWP model forecast

* First attempt to make correlations and thus weights lead time dependent (in progress..)

* Change back to original BPS2006 blending formulation and add regression of skill values to climatological values for weights determination
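
The regression of skill values toward climatology mentioned in this commit can be pictured as a simple relaxation per cascade level. This is a hedged sketch, not the exact BPS2006 formulation used in the blending code; the function name and the `regr_factor` parameter are illustrative.

```python
import numpy as np


def regress_to_climatology(rho, rho_clim, regr_factor):
    """Relax current per-cascade-level correlations ``rho`` toward
    climatological values ``rho_clim``.

    ``regr_factor`` lies in [0, 1]: 0 keeps the current skill
    unchanged, 1 returns pure climatology. Illustrative only.
    """
    rho = np.asarray(rho, dtype=float)
    rho_clim = np.asarray(rho_clim, dtype=float)
    return (1.0 - regr_factor) * rho + regr_factor * rho_clim
```

Increasing the regression factor with lead time would make the weights rely less on the instantaneous skill estimate and more on climatology as the forecast progresses.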

* Reformat code with Black

* Skill score script imports climatological correlation-values from file now

* Small changes to skill score script

* Add skill score tests and an interface

* Add skill score tests and an interface

* Small change to docstring

* Bom import xarray (#228)

* Add import_bom_rf3  using xarray

* Add tests to xarray version

* Fix mrms importer tests

* Pass **kwargs to internal functions

* Add nwp_importers to read bom nwp sample data

* Add bom nwp data to source file

* Add tests for bom_nwp reader

* Fix pystepsrc

Co-authored-by: Andres Perez Hortal <16256571+aperezhortal@users.noreply.github.com>

* Functions to store and compute climatological weights (#231)

* Implement the functions get_default_weights, save_weights, calc_clim_weights.

These functions are used to evolve the weights in the scale- and skill-dependent blending with NWP in the STEPS blending algorithm. The current weights, based on the correlations per cascade level, are regressed towards these climatological weights in the course of the forecast.

These functions save the current weights and compute the climatological weights (a running mean of the weights of the past n days, where typically n=30). First, daily averages are stored; these are then averaged over the running window of n days.
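
The running-mean computation described above can be sketched as follows. The function name and array layout are assumptions for illustration; the real `calc_clim_weights` in pysteps operates on the stored daily-average files.

```python
import numpy as np


def calc_clim_weights_sketch(daily_weights, window=30):
    """Running mean of daily-averaged weights over the past ``window``
    days, computed per cascade level.

    daily_weights: array-like of shape (n_days, n_levels), one row of
    daily-averaged weights per day, oldest first. Illustrative only.
    """
    daily_weights = np.asarray(daily_weights, dtype=float)
    # Keep at most the last ``window`` days, then average over time.
    recent = daily_weights[-window:]
    return recent.mean(axis=0)
```

With fewer than `window` days available, the sketch simply averages whatever history exists, which matches the spin-up behavior one would expect from a running mean.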

* Add tests for pysteps climatological weight io and calculations.

* Add path_workdir to outputs section in pystepsrc file and use it as a default path to store/retrieve blending weights.

* Minor changes to docstrings, changes to skill scores and testing scripts

* Completed documentation for blending clim module, cleanup.

Co-authored-by: RubenImhoff <r.o.imhoff@live.nl>

* Main blending module, first steps

* Add simple tests

* Minor changes to tester: velocity now based on rainfall field of NWP

* Add utilities to decompose, store and load NWP cascades for use in blending  (#232)

* First version of NWP decomposition

* Added saving to netCDF

* Completed functions for saving and loading decomposed NWP data

* Added example files for the decomposed NWP functions

* Added compatibility with numpy datetime

* Use default output path_workdir for tmp files in blending/utils.py.

* Update documentation of NWP decomposition functions in utils.py

Co-authored-by: Wout Dewettinck <wout.dewettinck@ugent.be>
Co-authored-by: wdewettin <87696913+wdewettin@users.noreply.github.com>

* Add importer for RMI NWP data (#234)

Add importer for netcdf NWP data from RMI using xarrays.

* Add test for RMI NWP data importer.

* Add entry for RMI NWP data in pystepsrc.

* Run black on everything: fix formatting.

* Add KNMI Harmonie NWP netcdf importer and tests (#235)

* Changes to v_models to make use of multiple timesteps; changes in the velocity field over time in the NWP forecast are now taken into account.

* Fixes for KNMI importer:

Add forgotten @postprocess_import()
Don't call dropna on NWP data.

* Avoid shadowing of pysteps.blending.utils by pysteps.utils

* First attempt for probability matching and masking utility; part 1

* Changes to prob matching and masking methods; part 2

* Prob matching and masking changes; part 3. Ready for testing with real data from here on

* Remove unnecessary print statements

* Cleanup imports

* More cleanup

* Update docstrings

* RMI importer for gallery example (will follow)

* Reprojection functionality (#236)

* Added Lesley's reprojection module to this branch

* Added compatibility for three-dimensional xarrays

* Add commentary to reprojection util

* Changes to make reprojection of KNMI data possible

* Changes after Daniele's review

* Add dependencies

* Changes to importers, see issue #215

* Add tests

* Fix some issues

* documentation

* Fixes for tests

* Set requirements again

* Some fixes

* Changes to nwp_importers after Carlos' response

* Remove wrong example script

* Remove rasterio dependencies from lists

* First try to prevent testing error

* Changes Daniele and fix knmi nwp importer

* Add rasterio to tox.ini

* Aesthetics

* rasterio import test

* Add rasterio to the test dependencies

* Reset try-except functionality for rasterio import

* Fix for failing test on windows python 3.6

* add importerskip rasterio

Co-authored-by: Wout Dewettinck <wout.dewettinck@ugent.be>

* Fixes in nwp importers

* Revert "Merge branch 'steps_blending' into pysteps-v2" (#239)

This reverts commit 2c639f8, reversing
changes made to bccb8fc.

* Merge latest version pysteps-v2 into steps_blending branch (#237)

* Update docstrings

* More cleanup

* Cleanup imports

* Cleanup imports

* More cleanup

* Update docstrings

* Update references

Mention the work of Ravuri et al (2021, Nature) as an example of work using cGANs to generate ensembles

* Clean up page

Co-authored-by: ned <daniele.nerini@meteoswiss.ch>
Co-authored-by: dnerini <daniele.nerini@gmail.com>
Co-authored-by: Wout Dewettinck <wout.dewettinck@ugent.be>

* NWP skill calculation only within radar domain

* Update docs

* Add example for gallery examples

* Fix docstrings example

* Remove additional normalization step

* Fixes for the tests

Co-authored-by: Carlos Velasco <carlos.velasco@bom.gov.au>
Co-authored-by: ned <daniele.nerini@meteoswiss.ch>
Co-authored-by: Andres Perez Hortal <16256571+aperezhortal@users.noreply.github.com>
Co-authored-by: Ruben Imhoff <Ruben.Imhoff@deltares.nl>
Co-authored-by: Carlos Velasco <cvelascof@gmail.com>
Co-authored-by: Lesley De Cruz <lesley.decruz+git@gmail.com>
Co-authored-by: Wout Dewettinck <wout.dewettinck@ugent.be>
Co-authored-by: wdewettin <87696913+wdewettin@users.noreply.github.com>
Co-authored-by: Lesley De Cruz <lesley.decruz@meteo.be>
Co-authored-by: dnerini <daniele.nerini@gmail.com>

Successfully merging this pull request may close these issues.

Implement a data model
4 participants