
Transolver (ICML 2024 Spotlight)

🚩 News (2024.10): Transolver has been integrated into NVIDIA Modulus.

Transolver: A Fast Transformer Solver for PDEs on General Geometries [Paper: https://arxiv.org/abs/2402.02366] [Slides] [Poster]

In real-world applications, PDEs are typically discretized into large-scale meshes with complex geometries. To capture the intricate physical correlations hidden beneath these diverse meshes, we propose Transolver with the following features:

  • Going beyond previous work, Transolver computes attention among learned physical states instead of mesh points, which equips the model with an inherent, geometry-general modeling capability.
  • Transolver achieves a 22% relative error reduction over the previous SOTA across six standard benchmarks and excels in large-scale industrial simulations, including car and airfoil designs.
  • Transolver offers favorable efficiency, scalability, and out-of-distribution generalizability.



Figure 1. Overview of Transolver.

Transolver vs. Previous Transformer Operators

Previous Transformer-based neural operators apply attention directly to mesh points. However, the massive number of mesh points in practical applications poses challenges for both computational cost and for capturing physical correlations.

Transolver is built on a more fundamental idea: learning the intrinsic physical states underlying complex geometries. This design frees the model from superficial and unwieldy meshes and lets it focus on physics modeling.

As shown below, Transolver precisely captures diverse physical states of PDEs, such as (a) various fluid-structure interactions in Darcy flow, (b) different extrusion regions of elastic materials, (c) the shock wave and wake flow around an airfoil, and (d) the front/back surfaces and the spaces above and below driving cars.



Figure 2. Visualization of learned physical states.
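
To make the idea above concrete, the mechanism (called Physics-Attention in the paper) can be summarized as: softly assign every mesh point to a small number of learnable slices, run standard attention among the resulting slice tokens, and broadcast the outcome back to the points, reducing the cost from O(N²) over N mesh points to roughly O(NM + M²) for M slices with M ≪ N. The following is a minimal, single-head PyTorch sketch of this idea, not the repository's implementation; the class name SliceAttentionSketch and the hyperparameter num_slices are illustrative.

import torch
import torch.nn as nn


class SliceAttentionSketch(nn.Module):
    """Attention among M learned physical states ("slices") instead of N mesh points."""

    def __init__(self, dim: int, num_slices: int = 32):
        super().__init__()
        self.to_slice_weights = nn.Linear(dim, num_slices)   # per-point soft slice assignment
        self.attn = nn.MultiheadAttention(dim, num_heads=1, batch_first=True)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, N, dim) features living on an arbitrary (unstructured) mesh
        w = torch.softmax(self.to_slice_weights(x), dim=-1)       # (batch, N, M) slice weights
        tokens = torch.einsum('bnm,bnc->bmc', w, x)               # aggregate points into M slice tokens
        tokens = tokens / (w.sum(dim=1).unsqueeze(-1) + 1e-6)     # weighted mean per slice
        tokens, _ = self.attn(tokens, tokens, tokens)             # attention among the M states only
        out = torch.einsum('bnm,bmc->bnc', w, tokens)             # broadcast states back to the points
        return self.proj(out)


x = torch.randn(2, 5000, 64)                         # 5000 mesh points with 64 channels
print(SliceAttentionSketch(64)(x).shape)             # torch.Size([2, 5000, 64])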

Get Started

  1. Please refer to the folder of each experiment for detailed instructions.

  2. List of experiments: the six standard PDE-solving benchmarks and the two design tasks (car and airfoil) reported in the Results section below.

Results

Transolver achieves consistent state-of-the-art performance on six standard benchmarks and two practical design tasks, compared against more than 20 baselines.



Table 1. Results on six standard benchmarks.



Table 2. Results on two design tasks: car and airfoil design.

Showcases



Figure 3. Comparison of Transolver and other models.

Citation

If you find this repo useful, please cite our paper.

@inproceedings{wu2024Transolver,
  title={Transolver: A Fast Transformer Solver for PDEs on General Geometries},
  author={Haixu Wu and Huakun Luo and Haowen Wang and Jianmin Wang and Mingsheng Long},
  booktitle={International Conference on Machine Learning},
  year={2024}
}

Contact

If you have any questions or want to use the code, please contact [email protected].

Acknowledgement

We sincerely appreciate the following GitHub repositories for their valuable code bases and datasets:

https://github.com/neuraloperator/neuraloperator

https://github.com/neuraloperator/Geo-FNO

https://github.com/thuml/Latent-Spectral-Models

https://github.com/Extrality/AirfRANS
