
Position-induced Transformer

The code in this repository presents six numerical experiments that use the Position-induced Transformer (PiT) for learning operators in partial differential equations. PiT is built upon the position-attention mechanism proposed in the paper Positional Knowledge is All You Need: Position-induced Transformer (PiT) for Operator Learning. The paper can be downloaded here.
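For intuition only, here is a minimal sketch of what a purely position-based attention layer can look like: the attention weights are computed solely from the pairwise distances between the sampling points, while the function values enter only through the value projection. All names below are hypothetical, and this simplification is not the exact PiT layer; see the paper and the scripts for the real formulation.

    import torch
    import torch.nn as nn

    class PositionAttentionSketch(nn.Module):
        """Illustrative position-based attention: weights depend only on point coordinates."""

        def __init__(self, dim, temperature=1.0):
            super().__init__()
            self.value = nn.Linear(dim, dim)  # hypothetical value projection of the features
            self.temperature = temperature    # controls how local the attention is

        def forward(self, u, x):
            # u: (batch, n_points, dim) function values at the sampling points
            # x: (batch, n_points, d)   coordinates of the sampling points
            dist = torch.cdist(x, x)  # (batch, n_points, n_points) pairwise distances
            attn = torch.softmax(-dist / self.temperature, dim=-1)  # nearer points weigh more
            return attn @ self.value(u)  # positional mixing of the value features

Note that in this sketch the attention map depends only on the mesh, not on the input function, so it could in principle be computed once per geometry; this is an observation about the sketch, not a claim about the repository's implementation.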

Contents

  • The numerical experiment on the one-dimensional inviscid Burgers' equation.
  • The numerical experiment on the one-dimensional compressible Euler equations.
  • The numerical experiment on the two-dimensional Darcy flow problem.
  • The numerical experiment on the two-dimensional incompressible Navier–Stokes equations.
  • The numerical experiment on the two-dimensional hyper-elastic problem.
  • The numerical experiment on the two-dimensional compressible Euler equations.

Datasets

The raw data required to reproduce the main results can be obtained from the sources of the baseline methods selected in our paper; a short loading sketch follows the list below.

  • For InviscidBurgers and ShockTube, the datasets are provided by Lanthaler et al. They can be downloaded here.
  • For Darcy2D and Vorticity, the datasets are provided by Li et al. They can be downloaded here.
  • For Elasticity and NACA, the datasets are provided by Li et al. They can be downloaded here.
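The files from these sources are typically MATLAB .mat archives. A minimal inspection sketch, assuming a SciPy-readable .mat file (the filename below is hypothetical; substitute the file you actually downloaded):

    import scipy.io

    # Hypothetical filename; use the path of the downloaded dataset.
    data = scipy.io.loadmat("darcy_train.mat")

    # Print the arrays stored in the file, skipping MATLAB metadata keys.
    for key, value in data.items():
        if not key.startswith("__"):
            print(key, getattr(value, "shape", type(value)))

    # Note: MATLAB v7.3 files are HDF5-based and need h5py instead of scipy.io.loadmat.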

Requirements

  • This code is primarily based on PyTorch. We have observed significant improvements in PiT's training speed with PyTorch 2.x, especially when using torch.compile. Therefore, we highly recommend using PyTorch 2.x with torch.compile enabled for optimal performance.
  • If any issues arise with torch.compile that cannot be resolved, the code is also compatible with recent versions of PyTorch 1.x. In that case, simply comment out the line model = torch.compile(model) in the scripts; a version-guarded alternative is sketched after this list.
  • Matplotlib and SciPy are also required.
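A hedged sketch of guarding the compile step so the same script runs under both PyTorch 1.x and 2.x (the stand-in model below is hypothetical; the actual scripts build the PiT model):

    import torch
    import torch.nn as nn

    model = nn.Linear(64, 64)  # stand-in for the PiT model built in the scripts

    # torch.compile was introduced in PyTorch 2.0; skip it gracefully on 1.x.
    if hasattr(torch, "compile"):
        model = torch.compile(model)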

Citations

@inproceedings{chen2024positional,
  title={Positional Knowledge is All You Need: Position-induced Transformer (PiT) for Operator Learning},
  author={Junfeng Chen and Kailiang Wu},
  booktitle={International Conference on Machine Learning},
  year={2024},
  organization={PMLR}
}
