Climate-Learning repo
This repository includes various routines used to analyze extreme events in climate models and reanalysis.
Below we show a composite conditioned on heatwaves in Scandinavia modelled by CESM (1000 years of data):
We are interested in predicting rare events such as heatwaves and cold spells. We use neural networks to compute committor functions, i.e. the conditional probability of occurrence of such events. Computations are performed on the Centre Blaise Pascal cluster at ENS de Lyon.
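Schematically (our shorthand here, not necessarily the exact notation of the paper cited below), the committor we estimate is

$$
q(x) = \mathbb{P}\left(A > a \mid X = x\right),
$$

the probability that the event amplitude $A$ (e.g. a time- and area-averaged temperature anomaly) exceeds a threshold $a$, given the current state $X = x$ of the climate system.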
- To install the repo to your local space you need to execute `git clone --recursive git@github.com:georgemilosh/Climate-Learning.git` (the `--recursive` flag deals with the submodule contained in this repo).
- To install the relevant packages, run the included `setup.sh`.
- To see how to work with our routines (such as working with data and training neural networks), consult `Plasim/tutorial.ipynb`.
- Another similar tutorial can be found in `CESM/CESM_tuto.ipynb`.
Generally, the data we used in this project is quite large. However, we were able to make a portion of it available through Zenodo, which contains 500 years of anomalies of:

- `tas.nc`: 2 meter temperature
- `zg500.nc`: 500 hPa geopotential height
- `mrso.nc`: soil moisture
- `lsmask.nc`: land-sea mask
- `gparea.nc`: cell area
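As an illustration, here is a minimal sketch of how one might inspect these files with `xarray` (the variable names stored inside the NetCDF files are assumptions; check the datasets for the actual names):

```python
# Minimal sketch: open the Zenodo files with xarray and inspect their contents.
import xarray as xr

tas = xr.open_dataset("tas.nc")        # 2 meter temperature anomalies
zg500 = xr.open_dataset("zg500.nc")    # 500 hPa geopotential height anomalies
mrso = xr.open_dataset("mrso.nc")      # soil moisture anomalies
lsmask = xr.open_dataset("lsmask.nc")  # land-sea mask
gparea = xr.open_dataset("gparea.nc")  # grid-cell area (useful for area-weighted means)

print(tas)  # dimensions, coordinates and data variables
```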
To understand our data, it helps to look at the tutorial we created for the Critical Earth ESR Workshop 2, held in April 2022 in Nijmegen, The Netherlands.
The folders of this repository store the `*.py` and `*.ipynb` scripts related to the following models and methods:
- PLASIM: Intermediate complexity climate model. This is where most of our scripts are located, including `Learn2_new.py` (responsible for training the CNN). This folder also contains `hyperparameter_optimization.py`, a very useful Bayesian hyperparameter optimizer based on the `optuna` library (see the sketch after this list).
- CESM: High fidelity climate model.
- ERA5: ECMWF reanalysis.
- SWG: We store Stochastic Weather Generator (SWG) related routines in the folder called `VAE`, which stands for Variational Autoencoder experiments. Importantly, this folder also contains the SWG without the use of the VAE.
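For readers unfamiliar with `optuna`, the snippet below is a generic, self-contained illustration of Bayesian hyperparameter optimization with that library; it is not the repository's `hyperparameter_optimization.py`, whose search space and objective are tied to the training pipeline. The hyperparameter names and the dummy objective are ours.

```python
# Generic optuna sketch: Bayesian search over a toy search space.
# hyperparameter_optimization.py wires this idea into actual model training.
import optuna

def objective(trial):
    # Hypothetical hyperparameters, for illustration only.
    lr = trial.suggest_float("lr", 1e-5, 1e-2, log=True)
    batch_size = trial.suggest_categorical("batch_size", [32, 64, 128])
    # In practice: train the network with (lr, batch_size) and return a validation loss.
    return (lr - 1e-3) ** 2 + 1.0 / batch_size  # dummy score to minimize

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)
print(study.best_params)
```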
One of the big advantages of this repository is that it easily supports customization.
The simplest way is to `import Learn2_new as ln` and then simply use the features that you need, but this is hardly customization. The second option is to leverage the full potential of the code by changing only some of its functions. Examples of this are `gaussian_approx`, `committor_projection_NN` and `hyperparameter_optimization`, modules which inherit from `Learn2_new`. A template for how to properly implement this inheritance is available here.
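As a rough sketch of the override pattern described above (our illustration, not the official template): `preprocess` is a placeholder name, so substitute a function that actually exists in `Learn2_new.py`, and prefer the inheritance template for real projects.

```python
# Sketch of customizing Learn2_new by replacing one of its functions.
# `preprocess` is a placeholder; consult Learn2_new.py and the inheritance
# template for the real function names and the recommended mechanism.
import Learn2_new as ln

_original_preprocess = ln.preprocess  # keep a handle on the original

def preprocess(*args, **kwargs):
    """Custom replacement that reuses the original and adds extra steps."""
    out = _original_preprocess(*args, **kwargs)
    # ... project-specific modifications here ...
    return out

# Downstream code that calls ln.preprocess now picks up the custom version.
ln.preprocess = preprocess
```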
Citation:
@article{PhysRevFluids.8.040501,
title = {Probabilistic forecasts of extreme heatwaves using convolutional neural networks in a regime of lack of data},
author = {Miloshevich, George and Cozian, Bastien and Abry, Patrice and Borgnat, Pierre and Bouchet, Freddy},
journal = {Phys. Rev. Fluids},
volume = {8},
issue = {4},
pages = {040501},
numpages = {40},
year = {2023},
month = {Apr},
publisher = {American Physical Society},
doi = {10.1103/PhysRevFluids.8.040501},
url = {https://link.aps.org/doi/10.1103/PhysRevFluids.8.040501}
}