pysteps.io with xarray #219
Conversation
Spontaneous questions:
* Fix bounding box coordinates
* Add missing metadata
* Ignore quality field
Spontaneous responses:
At this point I believe that "precipitation" is a good choice, but I agree that we may have other cases such as "rain_rate".
Yes ... it is better to follow CF standards, as xarray uses them as well, but I would prefer to keep key metadata in the structure as parameters of the variable ... because metadata that initially could easily be derived from the data may be lost after a number of data transformations, or may not be that easy to derive. For example, if a radar field is full of non-zero values, then how can the zero_value be derived?
OK, for the moment I decided to return a DataArray instead of a Dataset (hence ignoring the quality field, which we never used anyway). This way, we remain agnostic about which variable we are reading (it could be anything).
I was thinking that it would make more sense if the DataArray included a singleton "t" dimension with the timestamp of the radar image, what do you think?
My personal preference is to use a Dataset for the source input, as it stores multiple variables together, like 'precipitation' and 'projection' information. The 'main/interest' variable can always be selected from the Dataset before passing it into pySTEPS routines. Also, "time" should be a coordinate, as that is what really differentiates one radar field from another (assuming that all of them share common metadata for all time steps) ... using 'time' as a coordinate also helps to open and concatenate multiple radar files using xarray.open_mfdataset.
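As a rough sketch of the point about the "time" coordinate (variable and dimension names are assumed here, not the pysteps API): building single-timestep Datasets with "time" as a coordinate and concatenating them along it is exactly the pattern that `xarray.open_mfdataset` automates for files on disk.

```python
import numpy as np
import pandas as pd
import xarray as xr

def make_field(timestamp):
    """Build a single-timestep Dataset with "time" as a coordinate (names assumed)."""
    return xr.Dataset(
        {"precipitation": (("time", "y", "x"), np.zeros((1, 4, 4)))},
        coords={
            "time": [pd.Timestamp(timestamp)],
            "y": np.arange(4.0),
            "x": np.arange(4.0),
        },
    )

# Concatenating along "time" mimics what xarray.open_mfdataset does for
# multiple radar files sharing common metadata.
fields = [make_field("2021-06-01T00:00"), make_field("2021-06-01T00:05")]
series = xr.concat(fields, dim="time")
```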
I see your point concerning the dataset, @cvelascof, and this could be easily implemented for importers. I just wonder about the need for datasets when we mostly work with single variables (radar precip). One can always build a new dataset when needed (adding NWP data, for example). And do we need an extra variable for the projection? I've seen it done before for netCDFs, but we could try to simplify this by adopting EPSG codes, at which point it would be enough to add a crs=EPSG:xxxx attribute. What do you think? Also, most of our methods work on single arrays, so the migration to xarray would be easier with DataArrays. In addition, we shouldn't make any assumption about the name of the variables. One possible solution could be to return a dataset when importing, but then the user would have to pass the variable of interest to a given method, for example:
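The code snippet that followed this comment did not survive extraction. As a hypothetical stand-in (the function and variable names below are invented for illustration, not the actual pysteps API), the pattern being proposed might look like:

```python
import numpy as np
import xarray as xr

def nowcast_method(precip: xr.DataArray) -> xr.DataArray:
    """Stand-in for a pysteps routine that expects a single DataArray."""
    return precip.clip(min=0.0)

# The importer would return a Dataset holding one or more variables,
# with the CRS kept as a plain attribute (EPSG code, as suggested above) ...
dataset = xr.Dataset(
    {"precipitation": (("y", "x"), np.array([[-1.0, 2.0], [3.0, 4.0]]))},
    attrs={"crs": "EPSG:21781"},
)

# ... and the user selects the variable of interest before calling the method.
result = nowcast_method(dataset["precipitation"])
```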
That could work, @dnerini (thanks for all the work, by the way)! We only should not forget to change the examples in that case (to avoid confusion). I find it hard to judge whether we need to move to DataArrays or Datasets. As @dnerini pointed out, we generally work with single arrays. What I can imagine with our move towards blending, and perhaps later machine learning techniques, is that we will want to involve (at some point) more variables than just precipitation. If the information originates from one dataset, e.g. the NWP model, it may be somewhat redundant to import it into multiple DataArrays that contain similar grid, time, and metadata information. However, either way will work, and I think we just have to decide on what we see as the clearer and cleaner method. I personally have no preference.
Great work, @dnerini! I left some comments and questions, but other than that (and once we have made up our minds about the use of DataArrays or Datasets) good to go.
xsize, ysize = metadata["xpixelsize"], metadata["ypixelsize"]
# x_coords = np.arange(x1, x2, xsize) + xsize / 2
# y_coords = np.arange(y1, y2, ysize) + ysize / 2
x_coords = np.arange(x1, x1 + xsize * array.shape[1], xsize) + xsize / 2
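A quick check of the two coordinate formulations with made-up grid values (x1, x2, the pixel size, and the column count below are illustrative): both yield the cell centers whenever x1 and x2 really are the outer corners of the domain.

```python
import numpy as np

# Hypothetical grid: 10 columns of 1-km pixels spanning x1..x2 (corner coordinates).
x1, x2, xsize = 0.0, 10000.0, 1000.0
n_cols = 10

# Variant kept in the PR: derive the coordinates from the array shape.
x_from_shape = np.arange(x1, x1 + xsize * n_cols, xsize) + xsize / 2
# Commented-out variant: derive them from the bounding box directly.
x_from_bbox = np.arange(x1, x2, xsize) + xsize / 2

# Both point at the cell centers (500, 1500, ..., 9500) when x1/x2 are true corners;
# they diverge only when an importer reports corner-pixel coordinates instead.
assert np.allclose(x_from_shape, x_from_bbox)
```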
This gives the same result as the commented-out line at line 293, right? Then we could remove one of the two options. :)
In addition, just to check, this line is meant to point to the cell centers? (in that case everything is fine!)
Perhaps we should stay with DataArray for the moment for the core routines. That said, however, xarray opens netCDF files as Datasets, so eventually the user may need to select the variable from a dataset to run our core routines.
About 'projection', the CF conventions state that "A grid mapping variable may be referenced by a data variable in order to explicitly declare the coordinate reference system (CRS) used for the horizontal spatial coordinate values":
https://cfconventions.org/Data/cf-conventions/cf-conventions-1.8/cf-conventions.html#grid-mappings-and-projections
So if the outputs are to be netCDF, it would be useful to have Datasets with both the variable and the 'grid mapping variable' (aka the projection).
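A minimal sketch of that CF grid-mapping pattern in xarray (the projection name and parameter values below are illustrative, not taken from any pysteps importer): a dimensionless container variable holds the CRS description, and the data variable points at it via a `grid_mapping` attribute.

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(
    {
        # Data variable referencing the grid mapping variable by name.
        "precipitation": (
            ("y", "x"),
            np.zeros((4, 4)),
            {"grid_mapping": "projection"},
        ),
        # Dimensionless container variable holding the CRS description (CF style).
        "projection": (
            (),
            0,
            {
                "grid_mapping_name": "transverse_mercator",
                "longitude_of_central_meridian": 7.44,   # illustrative values
                "latitude_of_projection_origin": 46.95,
            },
        ),
    }
)
```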
> This gives the same result as the commented-out line at line 293, right? Then we could remove one of the two options. :)
It should, but at the moment the commented lines do not work because some importers wrongly report the x1, x2, y1, y2 coordinates as the coordinates of the corner pixels instead of the corners themselves... We should eventually use the commented code (it's safer), which is why I would prefer to leave it there for the time being.
> In addition, just to check, this line is meant to point to the cell centers? (in that case everything is fine!)
Exactly!
> Perhaps we should stay with DataArray for the moment for the core routines. That said, however, xarray opens netCDF files as Datasets, so eventually the user may need to select the variable from a dataset to run our core routines.
OK, let's stick to DataArrays for now, but we can easily switch to Datasets at any moment before releasing V2.
> About 'projection', the CF conventions state that "A grid mapping variable may be referenced by a data variable in order to explicitly declare the coordinate reference system (CRS) used for the horizontal spatial coordinate values":
> https://cfconventions.org/Data/cf-conventions/cf-conventions-1.8/cf-conventions.html#grid-mappings-and-projections
> So if the outputs are to be netCDF, it would be useful to have Datasets with both the variable and the 'grid mapping variable' (aka the projection).
Certainly very important for the netCDF exporters, perhaps less so internally to the code itself?
__, __, metadata = pysteps.io.import_knmi_hdf5(filename)
smart_assert(metadata[variable], expected, tolerance)
smart_assert(data_array.attrs[variable], expected, tolerance)
Now that x, y, pixelsize, etc. are part of the DataArray and no longer attributes or metadata (and thus not tested as such), should the KNMI importer also have a 'geodata' that is tested here, similar to the other importers?
See #218: the idea for V2 is to remove the national importers from the core library (pysteps) and refactor them as optional plugins. In this sense, the KNMI importer will become an external package (e.g. "pysteps-knmi-importer") that will be maintained (and tested) externally (by you, I suppose ;-) ).
This PR is by no means definitive, but should instead be seen as a first step to integrate the xarray data model into pysteps. I suggest we merge it pretty soon and move on to other modules, knowing that we will have to come back to the io module anyway to make more adjustments before V2 is ready.
A "legacy" option is added to revert the importers' and readers' behavior to version 1. This is a temporary solution to allow the examples, and other functions, to run as usual (v1.*). Hopefully, this will allow an easier transition to version 2 during the development process and will allow testing functions that were not updated to v2.
Many of the tests were updated to use the legacy data structures (v1). The tests that still contain issues were tagged with a TODO message and are skipped. This will allow incremental changes to be tested in the new v2. IMPORTANT: once the v2 branch is stable, we may remove the legacy compatibility from the code base and the tests.
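The legacy switch described above might be sketched as a decorator flag. This is a hypothetical illustration (names like `postprocess_import_sketch` and `import_toy` are invented), not the actual pysteps implementation: one keyword restores the v1 return type, while the default path wraps the data into a DataArray.

```python
import functools

import numpy as np
import xarray as xr

def postprocess_import_sketch(importer):
    """Hypothetical decorator wrapping a v1-style importer that returns (data, metadata)."""
    @functools.wraps(importer)
    def wrapper(*args, legacy=False, **kwargs):
        data, metadata = importer(*args, **kwargs)
        if legacy:
            # v1 behaviour: plain numpy array plus metadata dict.
            return data, metadata
        # v2 behaviour: a DataArray carrying the metadata as attributes.
        return xr.DataArray(data, dims=("y", "x"), attrs=metadata)
    return wrapper

@postprocess_import_sketch
def import_toy(filename):
    """Toy importer standing in for a national-format reader."""
    return np.zeros((3, 3)), {"unit": "mm/h"}

legacy_out = import_toy("some_file", legacy=True)  # (array, metadata) as in v1
xarray_out = import_toy("some_file")               # xarray.DataArray as in v2
```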
Great work @dnerini! I agree that we should merge it soon and advance in other modules. I pushed two commits that add a temporary "legacy" option to allow the importers and readers to behave as in version 1. Also, the tests that are still written for v1 were updated to use this new legacy option. The tests that couldn't be easily fixed were manually skipped and tagged with a TODO message. This legacy option will allow us to maintain the testing functionality through the migration to xarray.
Ah very nice @aperezhortal, thanks! It looks like the tests are now failing because of missing dependencies. I'll cherry-pick commit b4c165a to fix it as soon as possible (should have some time this afternoon).
Codecov Report

@@            Coverage Diff             @@
##           pysteps-v2     #219     +/-  ##
==============================================
- Coverage       79.84%   71.51%   -8.34%
==============================================
  Files             137      139       +2
  Lines            9929     9988      +59
==============================================
- Hits             7928     7143     -785
- Misses           2001     2845     +844
* First basic functions to implement STEPS blending
* Add compute of blend means, sigmas and recompose
* pysteps.io with xarray (#219)
* Add xarray dependency
* MCH importer returns an xarray Dataset
* Remove plot lines
* Remove import
* Adapt readers to xarray format
* Rewrite as more general decorator
* Add missing metadata
* Adapt io tests
* Mrms bounding box (#222)
* Fix bounding box coordinates
* Add missing metadata
* Import xarray DataArray; ignore quality field
* Black
* Do not hardcode metadata
* Address review comments by ruben
* Add a legacy option to the io functions: a "legacy" option is added to revert the importers' and readers' behavior to version 1. This is a temporary solution to allow the examples, and other functions, to run as usual (v1.*). Hopefully, this will allow an easier transition to version 2 during the development process and will allow testing functions that were not updated to v2.
* Fix compatibility problems with tests: many of the tests were updated to use the legacy data structures (v1). The tests that still contain issues were tagged with a TODO message and are skipped. This will allow incremental changes to be tested in the new v2. IMPORTANT: once the v2 branch is stable, we may remove the legacy compatibility from the code base and the tests.
* Update dependencies
* Ignore plugins test (Co-authored-by: Andres Perez Hortal <16256571+aperezhortal@users.noreply.github.com>)
* Add blend_optical_flow
* Changes to steps blending procedure - weights according to adjusted BPS2006 method
* Changes to blending procedures - adjust weights from original BPS2006 method
* Determine spatial correlation of NWP model forecast
* First attempt to make correlations and thus weights lead time dependent (in progress..)
* Change back to original BPS2006 blending formulation and add regression of skill values to climatological values for weights determination
* Reformat code with Black
* Skill score script imports climatological correlation-values from file now
* Small changes to skill score script
* Add skill score tests and an interface
* Small change to docstring
* Bom import xarray (#228)
* Add import_bom_rf3 using xarray
* Add tests to xarray version
* Fix mrms importer tests
* Pass **kwargs to internal functions
* Add nwp_importers to read bom nwp sample data
* Add bom nwp data to source file
* Add tests for bom_nwp reader
* Fix pystepsrc (Co-authored-by: Andres Perez Hortal <16256571+aperezhortal@users.noreply.github.com>)
* Functions to store and compute climatological weights (#231): implement the functions get_default_weights, save_weights, calc_clim_weights. These functions are used to evolve the weights in the scale- and skill-dependent blending with NWP in the STEPS blending algorithm. The current weights, based on the correlations per cascade level, are regressed towards these climatological weights in the course of the forecast. These functions save the current and compute the climatological weights (a running mean of the weights of the past n days, where typically n=30). First daily averages are stored and these are then averaged over the running window of n days.
* Add tests for pysteps climatological weight io and calculations.
* Add path_workdir to outputs section in pystepsrc file and use it as a default path to store/retrieve blending weights.
* Minor changes to docstrings, changes to skill scores and testing scripts
* Completed documentation for blending clim module, cleanup. (Co-authored-by: RubenImhoff <r.o.imhoff@live.nl>)
* Main blending module, first steps
* Add simple tests
* Minor changes to tester: velocity now based on rainfall field of NWP
* Add utilities to decompose, store and load NWP cascades for use in blending (#232)
* First version of NWP decomposition
* Added saving to netCDF
* Completed functions for saving and loading decomposed NWP data
* Added example files for the decomposed NWP functions
* Added compatibility with numpy datetime
* Use default output path_workdir for tmp files in blending/utils.py
* Update documentation of NWP decomposition functions in utils.py (Co-authored-by: Wout Dewettinck <wout.dewettinck@ugent.be>, wdewettin <87696913+wdewettin@users.noreply.github.com>)
* Add importer for RMI NWP data (#234): add importer for netcdf NWP data from RMI using xarrays
* Add test for RMI NWP data importer
* Add entry for RMI NWP data in pystepsrc
* Run black on everything: fix formatting
* Add KNMI Harmonie NWP netcdf importer and tests (#235)
* Changes to v_models to make use of multiple timesteps; changes in the velocity field over time in the NWP forecast will be taken into account now
* Fixes for KNMI importer: add forgotten @postprocess_import(); don't call dropna on NWP data
* Avoid shadowing of pysteps.blending.utils by pysteps.utils
* First attempt for probability matching and masking utility; part 1
* Changes to prob matching and masking methods; part 2
* Prob matching and masking changes; part 3. Ready for testing with real data from here on
* Remove unnecessary print statements
* Cleanup imports
* More cleanup
* Update docstrings
* RMI importer for gallery example (will follow)
* Reprojection functionality (#236)
* Added Lesley's reprojection module to this branch
* Added compatibility for three-dimensional xarrays
* Add commentary to reprojection util
* Changes to make reprojection of KNMI data possible
* Changes after Daniele's review
* Add dependencies
* Changes to importers, see issue #215
* Add tests
* Fix some issues
* Documentation
* Fixes for tests
* Set requirements again
* Some fixes
* Changes to nwp_importers after Carlos' response
* Remove wrong example script
* Remove rasterio dependencies from lists
* First try to prevent testing error
* Changes Daniele and fix knmi nwp importer
* Add rasterio to tox.ini
* Aesthetics
* Rasterio import test
* Add rasterio to the test dependencies
* Reset try-except functionality for rasterio import
* Fix for failing test on windows python 3.6
* Add importorskip rasterio (Co-authored-by: Wout Dewettinck <wout.dewettinck@ugent.be>)
* Fixes in nwp importers
* Revert "Merge branch 'steps_blending' into pysteps-v2" (#239): this reverts commit 2c639f8, reversing changes made to bccb8fc
* Merge latest version pysteps-v2 into steps_blending branch (#237): includes the reprojection functionality (#236) and revert (#239) changes listed above, plus the items below (Co-authored-by: ned <daniele.nerini@meteoswiss.ch>, dnerini <daniele.nerini@gmail.com>, Wout Dewettinck <wout.dewettinck@ugent.be>)
* Update references: mention the work of Ravuri et al (2021, Nature) as an example of work using cGANs to generate ensembles
* Clean up page
* NWP skill calculation only within radar domain
* Update docs
* Add example for gallery examples
* Fix docstrings example
* Remove additional normalization step
* Fixes for the tests
* Changes to post-processing rainfall field and docstrings
* Update contributing guidelines (#241): improve grammar; make the guide more concise; remove unused/unnecessary rules; indicate more clearly which parts of the guidelines are inspired by other projects (before, they were only mentioned at the end); change "Travis-CI" references to "GitHub Actions"
* Advect noise cascade
* Allow for moving domain mask of extrapolation component
* Minor fixes
* Linear blending (#229): implemented linear blending function; added example file and test; added compatibility for NWP ensembles. The PR is ready to go; making the code xarray ready will be done in a separate PR. (Co-authored-by: RubenImhoff <r.o.imhoff@live.nl>)
* Weights calculation adjustment outside radar domain if only one model present
* Allow for mirroring of advected noise cascade
* Implementation of weights following Seed et al. (2013)
* Allow for decomposed NWP precip and NWP velocity fields: part 2
* Store decomposed fields with compression
* Changes after first review Daniele
* Remove unnecessary print statement
* Fixes to blending utils and implementation of blending utils tests
* Remove unnecessary lines
* Fix one time step shift of extrapolation skill prior to blending
* Minor changes to blending climatology, blending weights and remove path_workdir from pystepsrc
* Make NWP forecast decomposition prior to blending function optional
* Use pathlib
* Extract methods
* Minor changes to docstrings
* Access climatological skill file for multiple NWP models and date string changes to prevent errors in blending.utils

Co-authored-by: Carlos Velasco <carlos.velasco@bom.gov.au>, ned <daniele.nerini@meteoswiss.ch>, Andres Perez Hortal <16256571+aperezhortal@users.noreply.github.com>, Ruben Imhoff <Ruben.Imhoff@deltares.nl>, Carlos Velasco <cvelascof@gmail.com>, Lesley De Cruz <lesley.decruz+git@gmail.com>, Wout Dewettinck <wout.dewettinck@ugent.be>, wdewettin <87696913+wdewettin@users.noreply.github.com>, Lesley De Cruz <lesley.decruz@meteo.be>, dnerini <daniele.nerini@gmail.com>
Include the new xarray-based data model in the pysteps.io module (see #12).
The imported data are converted into an xarray Dataset by means of a decorator.
See below for an example using MeteoSwiss data.
The same for a BOM file:
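The example output for the MeteoSwiss and BOM files originally embedded here did not survive extraction. As a stand-in, here is a sketch of the kind of structure such a decorated importer could return (all names, sizes, and attribute values below are illustrative, not the actual MCH or BOM output):

```python
import numpy as np
import xarray as xr

# A single radar field roughly as the decorated importers could return it:
# data plus spatial coordinates, with the key metadata kept as attributes.
data_array = xr.DataArray(
    np.zeros((640, 710)),
    dims=("y", "x"),
    coords={
        "x": np.arange(710) * 1000.0 + 500.0,  # cell centers in projected coords
        "y": np.arange(640) * 1000.0 + 500.0,
    },
    attrs={"unit": "mm/h", "zerovalue": 0.0},
    name="precipitation",
)
```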
edit: limit the scope of this PR to the io module only.