Author: J. Emmanuel Johnson
Collaborators:
Baseline Methods: Optimal Interpolation
Standard Methods: Kriging
$$ \mathbf{y} = \boldsymbol{\mu}(\mathbf{x}_{\phi}) + \mathbf{K}_{\phi} \left(\mathbf{y}_{obs} - \boldsymbol{\mu}(\mathbf{X}_{\phi})\right) $$
Modern Methods: Implicit Neural Representations
$$ \mathbf{y}_{obs} = \boldsymbol{f}(\mathbf{x}_{\phi};\boldsymbol{\theta}) $$
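As a rough illustration of the optimal interpolation / kriging update above, here is a minimal NumPy sketch; the function names, the kernel, and the noise term are placeholders rather than the repo's actual API.

```python
import numpy as np

def optimal_interpolation(x_grid, x_obs, y_obs, mean_fn, kernel_fn, noise=1e-3):
    """Minimal OI / kriging sketch: y = mu(x) + K (y_obs - mu(X_obs))."""
    K_oo = kernel_fn(x_obs, x_obs) + noise * np.eye(len(x_obs))  # obs-obs covariance
    K_go = kernel_fn(x_grid, x_obs)                              # grid-obs covariance
    residual = y_obs - mean_fn(x_obs)                            # innovation
    weights = np.linalg.solve(K_oo, residual)                    # K_oo^{-1} (y_obs - mu)
    return mean_fn(x_grid) + K_go @ weights                      # posterior mean on the grid
```

And a minimal coordinate-based (SIREN-style) network in PyTorch for the INR formulation, mapping spatio-temporal coordinates to SSH values; the layer sizes and input dimension are illustrative and not the configurations used in the experiments.

```python
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    """SIREN-style layer: sin(w0 * (W x + b))."""
    def __init__(self, in_dim, out_dim, w0=30.0):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.w0 = w0

    def forward(self, x):
        return torch.sin(self.w0 * self.linear(x))

class SirenINR(nn.Module):
    """f(x_phi; theta): coordinates (lon, lat, time) -> SSH."""
    def __init__(self, in_dim=3, hidden=128, out_dim=1, n_hidden=3):
        super().__init__()
        layers = [SineLayer(in_dim, hidden)]
        layers += [SineLayer(hidden, hidden) for _ in range(n_hidden)]
        layers += [nn.Linear(hidden, out_dim)]
        self.net = nn.Sequential(*layers)

    def forward(self, coords):
        return self.net(coords)

model = SirenINR()
coords = torch.rand(1024, 3)   # normalized (lon, lat, time) samples
ssh_pred = model(coords)       # (1024, 1) predicted SSH
```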
| Simulated Altimetry Tracks | Simulated SSH Field |
|---|---|
For more information, see the experiment page.
| Altimetry Tracks | SSH Field |
|---|---|
For more information, see the experiment page.
Image Regression (Jupyter Notebook)
A standard image regression problem on a fox image. This is the same experiment as the demo in Tancik et al. (2020).
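For context, here is a minimal sketch of the random Fourier feature mapping from Tancik et al. (2020) feeding a small coordinate MLP from pixel coordinates to RGB; the feature count, frequency scale, and layer sizes are illustrative, not the notebook's settings.

```python
import math
import torch
import torch.nn as nn

class FourierFeatures(nn.Module):
    """gamma(x) = [sin(2*pi*Bx), cos(2*pi*Bx)] with a fixed random projection B."""
    def __init__(self, in_dim=2, n_features=256, scale=10.0):
        super().__init__()
        self.register_buffer("B", torch.randn(in_dim, n_features) * scale)

    def forward(self, x):
        proj = 2 * math.pi * x @ self.B
        return torch.cat([torch.sin(proj), torch.cos(proj)], dim=-1)

# Coordinate MLP: pixel coordinates in [0, 1]^2 -> RGB values
model = nn.Sequential(
    FourierFeatures(),
    nn.Linear(512, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, 3), nn.Sigmoid(),
)

coords = torch.rand(4096, 2)   # a batch of pixel coordinates
rgb = model(coords)            # (4096, 3) predicted colors
```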
Image Regression + Physics Loss (Jupyter Notebook) (TODO)
The standard image regression problem with the physics-informed loss function, i.e. the Poisson constraint (gradient, Laplacian). This is the same experiment as in the SIREN paper, Sitzmann et al. (2020).
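A rough sketch of how the gradient and Laplacian terms of such a Poisson-style constraint can be computed with `torch.autograd`; the way they are combined into a loss below is only indicative, not the notebook's exact formulation.

```python
import torch

def gradient(y, x):
    """dy/dx for a scalar field y evaluated at coordinates x (x must have requires_grad=True)."""
    return torch.autograd.grad(y, x, grad_outputs=torch.ones_like(y), create_graph=True)[0]

def laplacian(y, x):
    """Sum of the second derivatives of y with respect to each coordinate of x."""
    grads = gradient(y, x)          # (N, D) first derivatives
    lap = torch.zeros_like(y)
    for i in range(x.shape[-1]):
        lap = lap + gradient(grads[..., i:i+1], x)[..., i:i+1]
    return lap

# Indicative usage inside a training step (model, coords, and targets are placeholders):
# coords.requires_grad_(True)
# pred = model(coords)
# loss = mse(gradient(pred, coords), grad_target) + mse(laplacian(pred, coords), lap_target)
```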
QG Simulation (Jupyter Notebook)
This uses a subset of the QG simulations to demonstrate how each of the networks performs. This application is useful for training INRs as potential mesh-free surrogates.
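A minimal standalone example of fitting a coordinate network to a gridded snapshot in plain PyTorch; the data, network, and hyperparameters here are placeholders and not the notebook's setup.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Placeholder data standing in for a QG snapshot: (lon, lat, time) coordinates and field values
coords = torch.rand(4096, 3)
field = torch.rand(4096, 1)

# Any coordinate network works here (e.g. the SIREN sketch above); a plain MLP keeps this self-contained
model = nn.Sequential(nn.Linear(3, 128), nn.Tanh(), nn.Linear(128, 128), nn.Tanh(), nn.Linear(128, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for step in range(1000):
    optimizer.zero_grad()
    loss = F.mse_loss(model(coords), field)
    loss.backward()
    optimizer.step()

# Mesh-free evaluation: query the trained network at arbitrary off-grid points
prediction = model(torch.rand(100, 3))
```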
QG Simulation + Physics Loss (Jupyter Notebook)
This uses a subset of the QG simulations to demonstrate how each of the networks performs along with the physics-informed QG loss function.
```bash
conda env create -f environments/torch_linux.yaml
```
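After creating the environment, activate it before running the notebooks or scripts; the environment name below is a guess based on the file name (check the `name:` field in the YAML).

```bash
# hypothetical environment name; check the name: field in torch_linux.yaml
conda activate torch_linux
```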
- 1.5 Layer QG Simulations - 94MB
- SSH Data Challenge 2021a
  - Train/Test Data - 116MB
  - Results: BASELINE - ~15MB; DUACS - ~4.5MB
- SSH Data Challenge 2020b (TODO)
- SSH 5 Year Altimetry Tracks (TODO)
Step 1: Go into the data folder

```bash
cd data
```

Step 2: Give permissions

```bash
chmod +x dl_dc21a.sh
```

Step 3: Download the data (bash or python). See the detailed steps below.

Run the bash script directly from the command line:

```bash
bash dl_dc21a.sh username password path/to/save/dir
```
Create a `.yaml` file. You can even append it to your already long `.yaml` file:
```yaml
aviso:
  username: username
  password: password
```
Download with the python script:

```bash
python dl_dc21a.py --credentials-file credentials.yaml --save-dir path/to/save/dir
```
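For reference, a credentials file in this format can be read with PyYAML roughly as follows; this is a hypothetical sketch, not the actual contents of `dl_dc21a.py`.

```python
import yaml

# Read the AVISO username/password from the credentials file shown above
with open("credentials.yaml") as f:
    credentials = yaml.safe_load(f)

username = credentials["aviso"]["username"]
password = credentials["aviso"]["password"]
```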
I have included some environment files for the new M1 macOS machines. I personally use an M1 MacBook and wanted to test the new PyTorch M1 compatibility, which makes use of the M1 GPU. I found that training and inference with PyTorch are much faster, which coincides with other users' experiences (e.g. here). In addition, Anaconda claims that other packages potentially get a 20% speedup. To install, use the requirement file:
```bash
mamba env create -f environments/torch_macos.yaml
```
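To confirm that PyTorch can actually see the M1 GPU, you can check the MPS backend; this is just a quick sanity check, not part of the repo's scripts.

```python
import torch

# Select the Metal (MPS) backend on Apple Silicon when it is available
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")
print(f"Using device: {device}")

x = torch.randn(2048, 2048, device=device)
y = x @ x  # runs on the M1 GPU when MPS is available
```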
Differences:

- The training scripts use the `skorch` distribution. This is because it takes advantage of the M1 GPU and I have seen a substantial speed-up.
- A different environment file, i.e. `torch_macos.yaml`.
I cannot get datashader to work on the M1, but using the new Anaconda distribution works fine.

```bash
mamba create -n anaconda
mamba install -n anaconda anaconda=2022.05
```
- ocean-data-challenges/2021a_SSH_mapping_OSE - Altimetry SSH datasets
- Jordi Bolibar
- Quentin Favre
- Jean-Michel Brankart
- Pierre Brasseur
- hrkz/torchqg - Quasi-geostrophic spectral solver in PyTorch
- lucidrains/siren-pytorch - Siren PyTorch Model
- kklemon/gon-pytorch - Fourier Features Network Model
- didriknielsen/survae_flows - Activation Functions & Conditional Distributions
- boschresearch/multiplicative-filter-networks - Multiplicative Filter Networks (Fourier, Gabor) Models
- vsitzmann/siren - Simple differential operators
- boschresearch/torchphysics - Advanced differential operators