Scalable algorithms for physics-informed neural and graph networks.
- Source: Data-Centric Engineering; 2022, Vol. 3, p1-26, 26p
- Publication Year: 2022
Abstract
- Physics-informed machine learning (PIML) has emerged as a promising approach for simulating complex physical and biological systems governed by multiscale processes for which some data are also available. In some instances, the objective is to discover part of the hidden physics from the available data, and PIML has proven particularly effective for such problems, where conventional methods may fail. Unlike commercial machine learning, where training deep neural networks requires big data, big data are not available in PIML; instead, such networks can be trained from additional information obtained by enforcing the physical laws at random points in the space-time domain. PIML integrates multimodal and multifidelity data with mathematical models and implements them using neural networks or graph networks. Here, we review prevailing trends in embedding physics into machine learning, using physics-informed neural networks (PINNs) based primarily on feed-forward neural networks and automatic differentiation. For more complex systems, systems of systems, and unstructured data, graph neural networks (GNNs) offer distinct advantages, and we review how physics-informed learning can be accomplished with GNNs that use graph exterior calculus to construct differential operators; we refer to these architectures as physics-informed graph networks (PIGNs). We present representative examples of both forward and inverse problems and discuss the advances needed to scale up PINNs, PIGNs, and GNNs more broadly for large-scale engineering problems.
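The abstract describes the core PINN mechanism: a feed-forward network is trained so that the PDE residual, computed by automatic differentiation, vanishes at random collocation points. The following is a minimal illustrative sketch of that idea in PyTorch, not the authors' code; the toy Poisson problem, network size, and optimizer settings are all assumptions made for illustration.

```python
import torch

torch.manual_seed(0)

# Toy problem: -u''(x) = pi^2 sin(pi x) on (0, 1), u(0) = u(1) = 0,
# whose exact solution is u(x) = sin(pi x).
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def pde_residual(x):
    x = x.requires_grad_(True)
    u = net(x)
    # First and second derivatives of the network output via autograd.
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return -d2u - torch.pi**2 * torch.sin(torch.pi * x)

x_bc = torch.tensor([[0.0], [1.0]])  # boundary points
for step in range(5000):
    opt.zero_grad()
    x_col = torch.rand(128, 1)                 # random collocation points
    loss_pde = pde_residual(x_col).pow(2).mean()
    loss_bc = net(x_bc).pow(2).mean()          # enforce u(0) = u(1) = 0
    (loss_pde + loss_bc).backward()
    opt.step()
```

The abstract also mentions that PIGNs build differential operators from graph exterior calculus rather than automatic differentiation. Below is a hedged sketch of the underlying discrete-calculus idea on a simple path graph: the node-edge incidence matrix acts as a discrete gradient and its transpose as a discrete divergence, so their composition yields a graph Laplacian. This is an assumed minimal illustration, not the paper's implementation.

```python
import numpy as np

n = 101                                   # nodes of a path graph over [0, 1]
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]

# Incidence matrix B: rows are edges, columns are nodes. B maps node fields
# to edge fields (discrete gradient); B.T maps back (discrete divergence).
B = np.zeros((n - 1, n))
for e in range(n - 1):
    B[e, e], B[e, e + 1] = -1.0, 1.0
B /= h                                    # scale by edge length

L = B.T @ B                               # graph Laplacian = -div(grad(.))
u = np.sin(np.pi * x)
# At interior nodes, L @ u approximates -u'' = pi^2 sin(pi x) to O(h^2).
print(np.max(np.abs((L @ u - np.pi**2 * np.sin(np.pi * x))[1:-1])))
```

In a PIGN, operators assembled this way replace the autodiff residual of a PINN, which is what makes the approach natural for unstructured data and systems of systems.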
Details
- Language: English
- Volume: 3
- Database: Complementary Index
- Journal: Data-Centric Engineering
- Publication Type: Academic Journal
- Accession number: 176421056
- Full Text: https://doi.org/10.1017/dce.2022.24