Essential Review Papers on Physics-Informed Neural Networks: A Curated Guide for Practitioners


Staying on top of a fast-growing research field is rarely easy.

I face this challenge firsthand as a practitioner in Physics-Informed Neural Networks (PINNs). New papers, whether algorithmic advancements or cutting-edge applications, are published at an accelerating pace by both academia and industry. While it's exciting to see this rapid development, it inevitably raises a pressing question: how do we keep up?

This is where I have found review papers to be exceptionally valuable. Good review papers are effective tools that distill essential insights and highlight important trends. They are big time-savers, guiding us through the flood of information.

In this blog post, I would like to share my personal, curated list of must-read review papers on PINNs, which have been especially influential for my own understanding and use of PINNs. These papers cover key aspects of PINNs, including algorithmic developments, implementation best practices, and real-world applications.

In addition to what is available in the existing literature, I have included one of my own review papers, which provides a comprehensive analysis of common functional usage patterns of PINNs, a practical perspective often missing from academic reviews. This analysis is based on my review of around 200 arXiv papers on PINNs across various engineering domains over the past 3 years, and it can serve as an essential guide for practitioners looking to deploy these techniques to tackle real-world challenges.

For each review paper, I will explain why it deserves your attention, highlighting its unique perspective and pointing out practical takeaways you can benefit from immediately.

Whether you are just getting started with PINNs, using them to tackle real-world problems, or exploring new research directions, I hope this collection makes navigating the busy field of PINN research easier for you.

Let's cut through the complexity together and focus on what truly matters.

1️⃣ Scientific Machine Learning through Physics-Informed Neural Networks: Where we are and what's next

📄 Paper at a glance

  • Authors: S. Cuomo, V. Schiano di Cola, F. Giampaolo, G. Rozza, M. Raissi, and F. Piccialli
  • Year: 2022
  • Link: arXiv

🔍 What it covers

This review is structured around key themes in PINNs: the fundamental components that define their architecture, theoretical aspects of their learning process, and their application to various computing challenges in engineering. The paper also explores the available toolsets, emerging trends, and future directions.

Fig 1. Overview of the #1 review paper. (Image by author)

✨ What’s unique

This review paper stands out in the following ways:

  • One of the best introductions to PINN fundamentals. This paper takes a well-paced approach to explaining PINNs from the ground up. It systematically dissects the building blocks of a PINN, covering the various underlying neural network architectures and their associated characteristics, how PDE constraints are incorporated, common training methodologies, and the learning theory (convergence, error analysis, etc.) of PINNs.
  • Putting PINNs in historical context. Rather than simply presenting PINNs as a standalone solution, the paper traces their development from earlier work on using deep learning to solve differential equations. This historical framing is valuable because it helps demystify PINNs by showing that they are an evolution of earlier ideas, and it makes it easier for practitioners to see what alternatives are available.
  • Equation-driven organization. Instead of classifying PINN research by scientific domain (e.g., geoscience, material science, etc.) as many other reviews do, this paper categorizes PINNs based on the types of differential equations (e.g., diffusion problems, advection problems, etc.) they solve. This encourages knowledge transfer, as the same set of PDEs may be used across multiple scientific domains. In addition, it makes it easier for practitioners to see the strengths and weaknesses of PINNs when dealing with different types of differential equations.

🛠 Practical goodies

Beyond its theoretical insights, this review paper offers immediately useful resources for practitioners:

  • A complete implementation example. This paper walks through a full PINN implementation to solve the 1D nonlinear Schrödinger equation. It covers translating the equation into a PINN formulation, handling boundary and initial conditions, defining neural network architectures, selecting training strategies, choosing collocation points, and applying optimization methods. All implementation details are clearly documented for easy reproducibility. The paper also compares PINN performance across different hyperparameter choices, which can offer immediately applicable insights for your own PINN experiments.
  • Available frameworks and software tools. The paper compiles a comprehensive list of major PINN toolkits, with detailed descriptions of each tool. The backends considered include not only TensorFlow and PyTorch but also Julia and JAX. This side-by-side comparison of frameworks is especially useful for selecting the right tool for your needs.
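The ingredients such a walkthrough covers (a neural surrogate, a PDE residual computed via automatic differentiation, boundary losses, and randomly sampled collocation points) can be sketched in a few dozen lines. The snippet below is a minimal illustration on a much simpler problem than the paper's Schrödinger example: the 1D Poisson equation u''(x) = -π² sin(πx) with u(0) = u(1) = 0, whose exact solution is sin(πx). It is a sketch of the general recipe, not the paper's code.

```python
import torch

torch.manual_seed(0)

# Small MLP surrogate u_theta(x); tanh activations keep derivatives smooth.
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def pde_residual(x):
    # Residual of u''(x) + pi^2 * sin(pi*x) = 0, derivatives via autograd.
    x = x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    return d2u + torch.pi**2 * torch.sin(torch.pi * x)

x_bc = torch.tensor([[0.0], [1.0]])  # boundary points, where u = 0
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    x_col = torch.rand(128, 1)       # fresh collocation points each step
    loss = (pde_residual(x_col) ** 2).mean() + (net(x_bc) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# The trained network should approximate u(x) = sin(pi*x).
x_test = torch.linspace(0, 1, 11).reshape(-1, 1)
err = (net(x_test) - torch.sin(torch.pi * x_test)).abs().max()
print(f"max abs error: {err.item():.3f}")
```

Swapping in a different PDE means changing only `pde_residual` and the boundary loss; this separation is what makes the recipe in the paper easy to adapt.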

💡Who would benefit

  • This review paper benefits anyone new to PINNs who is seeking a clear, structured introduction.
  • Engineers and developers looking for practical implementation guidance will find the realistic, hands-on demo and the thorough comparison of existing PINN frameworks most useful. Moreover, they can locate relevant prior work on differential equations similar to their current problem, offering insights they can leverage in their own problem-solving.
  • Researchers investigating theoretical aspects of PINN convergence, optimization, or efficiency will also greatly benefit from this paper.

2️⃣ From PINNs to PIKANs: Recent Advances in Physics-Informed Machine Learning

📄 Paper at a glance

  • Authors: J. D. Toscano, V. Oommen, A. J. Varghese, Z. Zou, N. A. Daryakenari, C. Wu, and G. E. Karniadakis
  • Year: 2024
  • Link: arXiv

🔍 What it covers

This paper provides one of the most up-to-date overviews of the latest advancements in PINNs. It emphasizes enhancements in network design, feature expansion, optimization strategies, uncertainty quantification, and theoretical insights. The paper also surveys key applications across a range of domains.

Fig 2. Overview of the #2 review paper. (Image by author)

✨ What’s unique

This review paper stands out in the following ways:

  • A structured taxonomy of algorithmic developments. One of the freshest contributions of this paper is its taxonomy of algorithmic advancements. This new taxonomy scheme elegantly organizes the advancements into three core areas, providing a clear framework for understanding both current developments and potential directions for future research. In addition, the illustrations used in the paper are top-notch and easily digestible.
Fig 3. The taxonomy of algorithmic developments in PINNs proposed by the #2 paper. (Image by author)
  • Spotlight on physics-informed Kolmogorov–Arnold Networks (KANs). KAN, a new architecture based on the Kolmogorov–Arnold representation theorem, is currently a hot topic in deep learning. Within the PINN community, work has already been done to replace the multilayer perceptron (MLP) representation with KANs to gain expressiveness and training efficiency. Until now, the community has lacked a comprehensive review of this new line of research; this review paper fills exactly that gap.
  • Review of uncertainty quantification (UQ) in PINNs. UQ is essential for the reliable and trustworthy deployment of PINNs in real-world engineering applications. This paper provides a dedicated section on UQ, explaining the common sources of uncertainty in solving differential equations with PINNs and reviewing strategies for quantifying prediction confidence.
  • Theoretical advances in PINN training dynamics. In practice, training PINNs is non-trivial. Practitioners are often puzzled by why PINN training sometimes fails, or how PINNs should be trained optimally. This paper provides one of the most detailed and up-to-date discussions on this aspect, covering Neural Tangent Kernel (NTK) analysis of PINNs, information bottleneck theory, and multi-objective optimization challenges.

🛠 Practical goodies

Though this review paper leans toward the theory-heavy side, two particularly valuable aspects stand out from a practical perspective:

  • A timeline of algorithmic advances in PINNs. The paper tracks the milestones of key advancements in PINNs, from the original PINN formulation to the most recent extensions to KANs. If you are working on algorithmic improvements, this timeline gives you a clear view of what has already been done. If you are struggling with PINN training or accuracy, you can use this table to find existing methods that might solve your issue.
  • A broad overview of PINN applications across domains. Compared with the other reviews, this paper strives to present the breadth of PINN applications not only in the engineering domains but also in less-covered fields such as finance. Practitioners can easily find prior work conducted in their domains and draw inspiration.

💡Who would benefit

  • For practitioners working in safety-critical fields that need confidence intervals or reliability estimates on their PINN predictions, the discussion on UQ will be useful. If you are struggling with PINN training instability, slow convergence, or unexpected failures, the discussion on PINN training dynamics will help unpack the theoretical reasons behind these issues.
  • Researchers may find this paper especially interesting because of the new taxonomy, which allows them to see patterns and identify gaps and opportunities for novel contributions. In addition, the review of cutting-edge work on PIKANs should prove inspiring.

3️⃣ Physics-Informed Neural Networks: An Application-Centric Guide

📄 Paper at a glance

  • Authors: S. Guo (this author)
  • Year: 2024
  • Link: Medium

🔍 What it covers

This article reviews how PINNs are used to tackle different types of engineering tasks. For each task category, the article discusses the problem statement, why PINNs are useful, and how PINNs can be implemented to address the problem, followed by a concrete use case published in the literature.

Fig 4. Overview of the #3 review paper. (Image by author)

✨ What’s unique

Unlike most reviews, which categorize PINN applications either by the type of differential equations solved or by specific engineering domains, this article takes the angle practitioners care about most: the engineering tasks solved by PINNs. The work is based on reviewing papers on PINN case studies scattered across various engineering domains. The outcome is a list of distilled, recurring functional usage patterns of PINNs:

  • Predictive modeling and simulations, where PINNs are leveraged for dynamical system forecasting, coupled system modeling, and surrogate modeling.
  • Optimization, where PINNs are commonly employed to achieve efficient design optimization, inverse design, model predictive control, and optimized sensor placement.
  • Data-driven insights, where PINNs are used to identify unknown parameters or functional forms of the system, as well as to assimilate observational data to better estimate the system states.
  • Data-driven enhancement, where PINNs are used to reconstruct the field and enhance the resolution of observational data.
  • Monitoring, diagnostics, and health assessment, where PINNs are leveraged to act as virtual sensors, anomaly detectors, health monitors, and predictive maintainers.
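As a minimal illustration of the data-driven-insights pattern (a toy sketch, not one of the article's case studies), the snippet below recovers an unknown decay rate k in the ODE u' = -k·u from noisy-free synthetic observations, simply by making k a trainable parameter alongside the network weights:

```python
import torch

torch.manual_seed(0)

# Synthetic observations of u(t) = exp(-2t); the decay rate k = 2 is "unknown".
t_obs = torch.linspace(0.0, 1.0, 20).reshape(-1, 1)
u_obs = torch.exp(-2.0 * t_obs)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
k = torch.nn.Parameter(torch.tensor(0.5))  # trainable physics parameter

opt = torch.optim.Adam(list(net.parameters()) + [k], lr=1e-2)

for step in range(3000):
    # Data loss: the network must match the observations.
    data_loss = ((net(t_obs) - u_obs) ** 2).mean()
    # Physics loss: residual of du/dt + k*u = 0 on collocation points.
    t_col = torch.rand(64, 1).requires_grad_(True)
    u = net(t_col)
    du = torch.autograd.grad(u.sum(), t_col, create_graph=True)[0]
    phys_loss = ((du + k * u) ** 2).mean()
    loss = data_loss + phys_loss
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"recovered k = {k.item():.2f}")  # should approach the true value 2.0
```

The same pattern, data loss plus physics residual with the unknown coefficient exposed to the optimizer, is what the inverse-problem case studies in the article scale up to PDEs.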

🛠 Practical goodies

This article places practitioners' needs at the forefront. While most existing review papers merely answer the question of what problems have been solved with PINNs, practitioners often seek more specific guidance on how to use PINNs to solve their own problems. That is precisely what this article tries to address.

Using the proposed five-category functional classification, practitioners can conveniently map their problems to these categories, see how others have solved them, and learn what worked and what didn't. Instead of reinventing the wheel, they can leverage established use cases and adapt proven solutions to their own problems.

💡Who would benefit

This review is best suited for practitioners who want to see how PINNs are actually being used in the real world. It can also be particularly valuable for cross-disciplinary innovation, as practitioners can learn from solutions developed in other fields.

4️⃣ An Expert’s Guide to Training Physics-informed Neural Networks

📄 Paper at a glance

  • Authors: S. Wang, S. Sankaran, H. Wang, P. Perdikaris
  • Year: 2023
  • Link: arXiv

🔍 What it covers

Though it doesn't market itself as a "standard" review, this paper goes all in on providing a comprehensive handbook for training PINNs. It presents a detailed set of best practices for training physics-informed neural networks, addressing issues like spectral bias, unbalanced loss terms, and causality violations. It also introduces challenging benchmarks and extensive ablation studies to demonstrate these methods.

Fig 5. Overview of the #4 review paper. (Image by author)

✨ What’s unique

  • A unified "expert's guide". The principal authors are active researchers in PINNs who have worked extensively on improving PINN training efficiency and model accuracy over the past years. This paper is a distilled summary of the authors' past work, synthesizing a broad range of recent PINN techniques (e.g., Fourier feature embeddings, adaptive loss weighting, causal training) into a cohesive training pipeline. It feels like having a mentor who tells you exactly what does and doesn't work with PINNs.
  • A thorough hyperparameter tuning study. This paper conducts various experiments to show how different tweaks (e.g., architectures, training schemes, etc.) play out on different PDE tasks. The ablation studies show precisely which methods move the needle, and by how much.
  • PDE benchmarks. The paper compiles a set of challenging PDE benchmarks and reports the state-of-the-art results that PINNs can achieve.
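One of the techniques synthesized in the pipeline, Fourier feature embeddings, is easy to sketch. The snippet below is a generic illustration of the idea (a fixed random Fourier projection feeding an MLP, which helps counter spectral bias toward low frequencies), not the authors' JAX implementation:

```python
import torch

class FourierFeatures(torch.nn.Module):
    """Map x -> [sin(2*pi*Bx), cos(2*pi*Bx)] with a fixed random matrix B.

    The scale of B controls which frequencies the downstream MLP can
    represent easily, mitigating spectral bias toward low frequencies.
    """
    def __init__(self, in_dim, n_features=64, scale=5.0):
        super().__init__()
        # Fixed (non-trainable) random projection, stored as a buffer.
        self.register_buffer("B", torch.randn(in_dim, n_features) * scale)

    def forward(self, x):
        proj = 2 * torch.pi * x @ self.B
        return torch.cat([torch.sin(proj), torch.cos(proj)], dim=-1)

# A PINN backbone that sees Fourier features instead of raw coordinates.
model = torch.nn.Sequential(
    FourierFeatures(in_dim=1, n_features=64, scale=5.0),  # 1 -> 128 features
    torch.nn.Linear(128, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)

x = torch.linspace(0, 1, 8).reshape(-1, 1)
print(model(x).shape)  # torch.Size([8, 1])
```

The `scale` hyperparameter is problem-dependent; the paper's ablations are precisely the kind of guidance that helps choose it.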

🛠 Practical goodies

  • A problem-solution cheat sheet. This paper thoroughly documents techniques for addressing common PINN training pain points. Each technique is clearly presented in a structured format: the why (motivation), the how (how the approach addresses the issue), and the what (the implementation details). This makes it easy for practitioners to identify the "cure" based on the "symptoms" observed in their PINN training process. Better still, the authors transparently discuss the potential pitfalls of each approach, allowing practitioners to make well-informed decisions and effective trade-offs.
  • Empirical insights. The paper shares valuable empirical insights obtained from extensive hyperparameter tuning experiments. It offers practical guidance on selecting suitable hyperparameters, e.g., network architectures and learning rate schedules, and demonstrates how these parameters interact with the advanced PINN training techniques proposed.
  • Ready-to-use library. The paper is accompanied by an optimized JAX library that practitioners can directly adopt or customize. The library supports multi-GPU environments and is ready to scale to large problems.

💡Who would benefit

  • Practitioners who are struggling with unstable or slow PINN training will find many practical strategies for fixing common pathologies. They can also benefit from the ready-made templates (in JAX) to quickly adapt PINNs to their own PDE setups.
  • Researchers looking for challenging benchmark problems, or aiming to benchmark new PINN ideas against well-documented baselines, will find this paper especially handy.

5️⃣ Domain-Specific Review Papers

Beyond the general reviews of PINNs, several nice review papers focus on specific scientific and engineering domains. If you are working in one of these fields, these reviews can provide a deeper dive into best practices and cutting-edge applications.

1. Heat Transfer Problems

Paper: Physics-Informed Neural Networks for Heat Transfer Problems

The paper provides an application-centric discussion of how PINNs can be used to tackle various thermal engineering problems, including inverse heat transfer, convection-dominated flows, and phase-change modeling. It highlights real-world challenges such as missing boundary conditions, sensor-driven inverse problems, and adaptive cooling system design. The industrial case study on power electronics is especially insightful for understanding the use of PINNs in practice.

2. Power Systems

Paper: Applications of Physics-Informed Neural Networks in Power Systems — A Review

This paper offers a structured overview of how PINNs are applied to critical power grid challenges, including state/parameter estimation, dynamic analysis, power flow calculation, optimal power flow (OPF), anomaly detection, and model synthesis. For each type of application, the paper discusses the shortcomings of traditional power system solutions and explains why PINNs could be advantageous in addressing them. This comparative summary is useful for understanding the motivation for adopting PINNs.

3. Fluid Mechanics

Paper: Physics-informed neural networks (PINNs) for fluid mechanics: A review

This paper explores three detailed case studies that showcase PINN applications in fluid dynamics: (1) 3D wake flow reconstruction using sparse 2D velocity data, (2) inverse problems in compressible flow (e.g., shock wave prediction with minimal boundary data), and (3) biomedical flow modeling, where PINNs infer thrombus material properties from phase-field data. The paper highlights how PINNs overcome limitations of traditional CFD, e.g., mesh dependency, expensive data assimilation, and difficulty handling ill-posed inverse problems.

4. Additive Manufacturing

Paper: A review on physics-informed machine learning for monitoring metal additive manufacturing process

This paper examines how PINNs address critical challenges in additive manufacturing process prediction and monitoring, including temperature field prediction, fluid dynamics modeling, fatigue life estimation, accelerated finite element simulations, and process characteristics prediction.

6️⃣ Conclusion

In this blog post, we went through a curated list of review papers on PINNs, covering fundamental theoretical insights, the latest algorithmic advancements, and practical, application-oriented perspectives. For each paper, we highlighted its unique contributions, key takeaways, and the audience that would benefit most from its insights. I hope this curated collection helps you better navigate the evolving field of PINNs.
