This repository contains six numerical experiments that use the Position-induced Transformer (PiT) to learn operators in partial differential equations. PiT is built upon the position-attention mechanism proposed in the paper Positional Knowledge is All You Need: Position-induced Transformer (PiT) for Operator Learning. The paper can be downloaded here. A schematic sketch of the position-attention idea follows the experiment list below.
- The numerical experiment on the one-dimensional inviscid Burgers' equation.
- The numerical experiment on the one-dimensional compressible Euler equations.
- The numerical experiment on the two-dimensional Darcy flow problem.
- The numerical experiment on the two-dimensional incompressible Navier–Stokes equations.
- The numerical experiment on the two-dimensional hyper-elastic problem.
- The numerical experiment on the two-dimensional compressible Euler equations.
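The following is a minimal, illustrative sketch of the general idea behind position-attention, not the repository's implementation; the paper's exact formulation (normalization, learnable locality parameters, multi-head structure) may differ. The key point is that the attention weights are computed from pairwise distances between mesh points rather than from query-key products:

```python
import torch

def position_attention(pos_q, pos_k, values, lam=1.0):
    # Pairwise Euclidean distances between query and key points
    dist = torch.cdist(pos_q, pos_k)              # (Nq, Nk)
    # Nearby points receive larger weights; no query-key dot products
    weights = torch.softmax(-lam * dist, dim=-1)  # (Nq, Nk)
    return weights @ values                       # (Nq, c)

pos = torch.rand(128, 2)      # mesh-point coordinates in 2D
feat = torch.randn(128, 32)   # latent features at those points
out = position_attention(pos, pos, feat, lam=8.0)
print(out.shape)              # torch.Size([128, 32])
```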
The raw data required to reproduce the main results can be obtained from the repositories of some of the baseline methods compared in our paper; a minimal loading sketch follows the list below.
- For InviscidBurgers and ShockTube, data sets are provided by Lanthaler et al. They can be downloaded here.
- For Darcy2D and Vorticity, data sets are provided by Li et al. They can be downloaded here.
- For Elasticity and NACA, data sets are provided by Li et al. They can be downloaded here.
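For example, the Darcy2D data from Li et al. ships as MATLAB `.mat` files, which can be read with SciPy. Below is a minimal loading sketch; the file name and variable keys are assumptions and should be checked against the actual download:

```python
import numpy as np
from scipy.io import loadmat

# File name and variable keys are hypothetical; inspect the downloaded
# file (e.g., print(loadmat(...).keys())) to confirm the actual fields.
data = loadmat("piececonst_r421_N1024_smooth1.mat")
a = np.asarray(data["coeff"], dtype=np.float32)  # input coefficient fields
u = np.asarray(data["sol"], dtype=np.float32)    # corresponding solutions
print(a.shape, u.shape)
```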
- This code is primarily based on PyTorch. We have observed significant improvements in PiT's training speed with PyTorch 2.x, especially when using `torch.compile`. We therefore highly recommend using PyTorch 2.x with `torch.compile` enabled for optimal performance; a minimal sketch follows this list.
- If any issues arise with `torch.compile` that cannot be resolved, the code is also compatible with recent versions of PyTorch 1.x. In that case, simply comment out the line `model = torch.compile(model)` in the scripts.
- Matplotlib and SciPy are also required.
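As a minimal sketch of the recommendation above (the placeholder network stands in for the actual PiT model built in the experiment scripts):

```python
import torch
from torch import nn

# Placeholder network standing in for the PiT model defined in the scripts
model = nn.Sequential(nn.Linear(64, 64), nn.GELU(), nn.Linear(64, 64))

# torch.compile is only available on PyTorch 2.x; guarding with hasattr
# has the same effect as commenting the line out on PyTorch 1.x
if hasattr(torch, "compile"):
    model = torch.compile(model)

x = torch.randn(8, 64)
print(model(x).shape)  # compilation happens lazily on the first call
```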
If you find PiT useful in your research, please cite our paper:

```bibtex
@inproceedings{chen2024positional,
  title={Positional Knowledge is All You Need: Position-induced Transformer (PiT) for Operator Learning},
  author={Chen, Junfeng and Wu, Kailiang},
  booktitle={International Conference on Machine Learning},
  year={2024},
  organization={PMLR}
}
```