
Boosting the training of physics informed neural networks with transfer learning

Differential equations are ubiquitous in science and engineering, and their solutions are crucial for many applications across the scientific disciplines described by applied mathematics. Unfortunately, differential equations are notoriously hard to solve: for almost all practically relevant differential equations, exact analytical solutions are unknown. Numerical solvers that approximate solutions to a given accuracy have therefore been developed over several decades. Despite these tremendous efforts, sophisticated numerical simulations may still take prohibitively long (weeks to months), even on high-performance computing clusters. The design of new, efficient algorithms for differential equations is therefore of primary importance and could have revolutionary effects in both academia and industry.

With the rise of artificial intelligence, artificial neural networks have emerged as one of the most promising data-driven algorithms. Recently, they have been applied to differential equations: physics-informed neural networks (PINNs) have been proposed as a framework for solving differential equations numerically, but computational effort remains a bottleneck.

This bachelor's thesis outlines a strategy to obtain numerical results with physics-informed neural networks in a fraction of the time while pushing accuracy to new levels. This boost in speed and accuracy is achieved through transfer learning: instead of starting from a random initial guess, knowledge previously acquired on similar problems is transferred to a new, related task (see the sketch below). Empirical results demonstrate that this transfer of knowledge in physics-informed neural networks is beneficial for any two sufficiently related problems. Moreover, the achieved boosts correlate with the underlying similarity of the two problems, resulting in efficiency improvements of up to almost two orders of magnitude while outperforming other approaches in accuracy.
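The transfer-learning step itself is simple to illustrate. The following minimal PyTorch sketch (not the code in this repository) trains a PINN on a source task and then reuses its weights, rather than a random initialisation, for a related target task. The network sizes, the toy ODE u'(x) = c·u(x) with u(0) = 1, and the training schedule are all illustrative assumptions.

```python
import torch
import torch.nn as nn

def make_pinn():
    # Small fully connected network; layer sizes are illustrative.
    return nn.Sequential(
        nn.Linear(1, 32), nn.Tanh(),
        nn.Linear(32, 32), nn.Tanh(),
        nn.Linear(32, 1),
    )

def physics_loss(model, x, c):
    # Residual of the toy ODE u'(x) = c * u(x); c parameterises the task family.
    x = x.requires_grad_(True)
    u = model(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    residual = ((du - c * u) ** 2).mean()
    # Initial-condition term enforcing u(0) = 1.
    ic = (model(torch.zeros(1, 1)) - 1.0).pow(2).mean()
    return residual + ic

def train(model, c, steps=2000):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.linspace(0.0, 1.0, 64).reshape(-1, 1)
    for _ in range(steps):
        opt.zero_grad()
        loss = physics_loss(model, x, c)
        loss.backward()
        opt.step()
    return model

# Source task: train from a random initialisation.
source = train(make_pinn(), c=1.0)

# Target task: start from the source weights instead of a random guess.
target = make_pinn()
target.load_state_dict(source.state_dict())  # the transfer-learning step
target = train(target, c=1.2, steps=500)     # typically converges in far fewer steps
```

The closer the target parameter (here c = 1.2) is to the source parameter, the better the transferred weights serve as an initial guess, which is the similarity-vs-speedup correlation described above.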

On a larger scale, these findings motivate the establishment of an open-source database of pre-trained physics-informed neural networks. It is shown that such a database would be self-reinforcing, i.e. its effectiveness would increase with the number of its entries. Large databases could therefore pave the way to boosting physics-informed neural networks in many applications.
