Transformers for modeling physical systems
- Authors
- Nicholas Geneva and Nicholas Zabaras
- Subjects
- Computer Science - Machine Learning (cs.LG), Physics - Computational Physics (physics.comp-ph), Artificial Intelligence, Deep Learning, Natural Language Processing, Transformers, Dynamical Systems, Physical Systems, Cognitive Neuroscience
- Abstract
Transformers are widely used in natural language processing due to their ability to model longer-term dependencies in text. Although these models achieve state-of-the-art performance for many language-related tasks, their applicability outside of the natural language processing field has been minimal. In this work, we propose the use of transformer models for the prediction of dynamical systems representative of physical phenomena. The use of Koopman-based embeddings provides a unique and powerful method for projecting any dynamical system into a vector representation which can then be predicted by a transformer. The proposed model is able to accurately predict various dynamical systems and outperform classical methods that are commonly used in the scientific machine learning literature.
- Comments
- 22 pages, 14 figures, 3 appendices
- Published
- 2022
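
The core idea described in the abstract can be sketched in a few lines of PyTorch. The following is a minimal, hypothetical illustration rather than the authors' released implementation: an autoencoder stands in for the Koopman-based embedding that maps physical states into a latent vector space, and a causally masked transformer predicts the embedded trajectory. All module names, layer sizes, and dimensions here are assumptions made for illustration.

```python
# Minimal sketch (assumed architecture, not the authors' code): an
# autoencoder plays the role of the Koopman-based embedding, and a
# causally masked transformer predicts the embedded trajectory.
import torch
import torch.nn as nn

class KoopmanEmbedding(nn.Module):
    """Map a physical state to a latent vector and back; the latent
    space is where the transformer operates. Sizes are illustrative."""
    def __init__(self, state_dim=3, embed_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(), nn.Linear(64, embed_dim))
        self.decoder = nn.Sequential(
            nn.Linear(embed_dim, 64), nn.ReLU(), nn.Linear(64, state_dim))

class LatentTransformer(nn.Module):
    """Causal transformer over the embedded trajectory: each time step
    attends only to earlier steps, so the outputs can be read as
    one-step predictions of the latent dynamics."""
    def __init__(self, embed_dim=32, n_heads=4, n_layers=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=n_heads, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, z_seq):
        T = z_seq.size(1)
        # Upper-triangular -inf mask enforces causal attention.
        mask = torch.triu(torch.full((T, T), float('-inf')), diagonal=1)
        return self.blocks(z_seq, mask=mask)

# Usage: embed a trajectory, predict latent states, decode to physical space.
embed, model = KoopmanEmbedding(), LatentTransformer()
states = torch.randn(8, 16, 3)        # (batch, time, state_dim), toy trajectory
z = embed.encoder(states)             # project states into the latent space
z_pred = model(z)                     # transformer rollout in latent space
x_pred = embed.decoder(z_pred)        # decode predictions back to states
```

In this reading, the embedding network and the transformer can be trained jointly or in stages (reconstruction loss on the autoencoder, one-step prediction loss on the latent sequence); the abstract does not specify the training procedure, so that choice is left open here.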