1. Mamba Neural Operator: Who Wins? Transformers vs. State-Space Models for PDEs
- Author
Cheng, Chun-Wun, Huang, Jiahao, Zhang, Yi, Yang, Guang, Schönlieb, Carola-Bibiane, and Aviles-Rivero, Angelica I.
- Subjects
Computer Science - Machine Learning, Mathematics - Numerical Analysis
- Abstract
Partial differential equations (PDEs) are widely used to model complex physical systems, but solving them efficiently remains a significant challenge. Recently, Transformers have emerged as the preferred architecture for PDEs due to their ability to capture intricate dependencies. However, they struggle with representing continuous dynamics and long-range interactions. To overcome these limitations, we introduce the Mamba Neural Operator (MNO), a novel framework that enhances neural operator-based techniques for solving PDEs. MNO establishes a formal theoretical connection between structured state-space models (SSMs) and neural operators, offering a unified structure that can adapt to diverse architectures, including Transformer-based models. By leveraging the structured design of SSMs, MNO captures long-range dependencies and continuous dynamics more effectively than traditional Transformers. Through extensive analysis, we show that MNO significantly boosts the expressive power and accuracy of neural operators, making it not just a complement but a superior framework for PDE-related tasks, bridging the gap between efficient representation and accurate solution approximation.
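
The abstract's central claim is that a structured state-space model (SSM) can serve as the building block of a neural operator layer. As a rough illustration only, and not the authors' MNO implementation, the sketch below shows a minimal diagonal linear SSM, discretized with zero-order hold and scanned sequentially over a 1D field sampled on a grid; the function name, state size, and parameter choices are all illustrative assumptions.

```python
# Hypothetical sketch of an SSM scan over a discretized 1D PDE field.
# Not the MNO architecture from the paper; names and sizes are assumptions.
import numpy as np

def ssm_layer(u, A, B, C, dt):
    """Apply a diagonal SSM  x' = A x + B u,  y = C x  to a sequence u.

    u  : (L,) input signal (e.g. a PDE solution sampled on L grid points)
    A  : (N,) diagonal continuous-time state matrix (negative entries for stability)
    B  : (N,) input projection
    C  : (N,) output projection
    dt : scalar step size (grid spacing)
    """
    # Zero-order-hold discretization of the continuous-time dynamics.
    A_bar = np.exp(dt * A)              # discrete state transition, (N,)
    B_bar = (A_bar - 1.0) / A * B       # discrete input matrix, (N,)

    x = np.zeros_like(A)                # hidden state
    y = np.empty_like(u)
    for k in range(len(u)):             # sequential scan over the grid
        x = A_bar * x + B_bar * u[k]
        y[k] = float(C @ x)
    return y

# Toy usage: filter a sampled sine wave with a stable 16-state SSM.
L, N = 256, 16
u = np.sin(np.linspace(0.0, 4.0 * np.pi, L))
A = -np.linspace(1.0, 16.0, N)          # stable (negative) diagonal dynamics
B = np.ones(N)
C = np.random.default_rng(0).normal(size=N) / N
y = ssm_layer(u, A, B, C, dt=1.0 / L)
```

Because the recurrence is derived from continuous-time dynamics, the same parameters can in principle be applied at different grid resolutions by changing `dt`, which is the kind of resolution-independent behavior expected of a neural operator.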
- Published
2024