Balanced Neural ODEs: nonlinear model order reduction and Koopman operator approximations
- Publication Year :
- 2024
Abstract
- Variational Autoencoders (VAEs) are a powerful framework for learning compact latent representations, while Neural ODEs excel at learning transient system dynamics. This work combines the strengths of both to create fast surrogate models with adjustable complexity. By leveraging the VAE's dimensionality reduction with a non-hierarchical prior, our method adaptively assigns stochastic noise, naturally complementing known Neural ODE training enhancements and enabling probabilistic time-series modeling. We show that standard Latent ODEs struggle with dimensionality reduction in systems with time-varying inputs. Our approach mitigates this by continuously propagating variational parameters through time, establishing fixed information channels in latent space. The result is a flexible and robust method that can learn latent dynamics of differing complexity, e.g. deep neural networks or linear matrices, and thereby enables efficient approximation of the Koopman operator without the need to predefine its dimensionality. Because our method balances dimensionality reduction against reconstruction accuracy, we call it the Balanced Neural ODE (B-NODE). We demonstrate its effectiveness on academic test cases and apply it to a real-world example of a thermal power plant.
  Comment: Conference paper under review
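  The abstract's key architectural idea is to propagate the variational parameters (mean and log-variance) through time with a learned drift, so that each latent dimension keeps a fixed information channel. The following is a minimal, hypothetical PyTorch sketch of that idea, not the authors' implementation; all names (BNodeSketch, the layer sizes, the explicit Euler integration) are illustrative assumptions.

  ```python
  # Hypothetical sketch (not the paper's code): a latent ODE whose drift acts on
  # the concatenated variational parameters [mu, logvar], so each latent
  # dimension acts as a fixed stochastic information channel over time.
  import torch
  import torch.nn as nn

  class BNodeSketch(nn.Module):
      def __init__(self, obs_dim: int, latent_dim: int, hidden: int = 64):
          super().__init__()
          # Encoder maps the initial observation to (mu, logvar).
          self.encoder = nn.Sequential(nn.Linear(obs_dim, hidden), nn.Tanh(),
                                       nn.Linear(hidden, 2 * latent_dim))
          # Drift evolves the variational parameters themselves through time.
          self.drift = nn.Sequential(nn.Linear(2 * latent_dim, hidden), nn.Tanh(),
                                     nn.Linear(hidden, 2 * latent_dim))
          self.decoder = nn.Linear(latent_dim, obs_dim)

      def forward(self, x0: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
          """x0: (batch, obs_dim); t: (steps,) increasing time grid."""
          params = self.encoder(x0)                      # initial (mu, logvar)
          recons = []
          for i in range(len(t) - 1):
              dt = t[i + 1] - t[i]
              params = params + dt * self.drift(params)  # explicit Euler step
              mu, logvar = params.chunk(2, dim=-1)
              z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
              recons.append(self.decoder(z))
          return torch.stack(recons, dim=1)              # (batch, steps-1, obs_dim)

  model = BNodeSketch(obs_dim=3, latent_dim=2)
  x0 = torch.randn(8, 3)
  t = torch.linspace(0.0, 1.0, 20)
  print(model(x0, t).shape)  # torch.Size([8, 19, 3])
  ```

  Replacing the nonlinear drift network with a single linear layer would correspond to the linear (Koopman-style) latent dynamics mentioned in the abstract; that swap is an assumption here, not a description of the paper's exact setup.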
- Subjects :
- Computer Science - Machine Learning
- Statistics - Machine Learning
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2410.10174
- Document Type :
- Working Paper