On the benefit of overparameterisation in state reconstruction: An empirical study of the nonlinear case
- Publication Year :
- 2023
Abstract
- The empirical success of machine learning models with many more parameters than measurements has generated interest in the theory of overparameterisation, i.e., of underdetermined models. This paradigm has recently been studied in domains such as deep learning, where one is interested in good (local) minima of complex, nonlinear loss functions. Optimisers such as gradient descent perform well there, consistently reaching good solutions. Similar nonlinear optimisation problems arise in system identification; an example of such a high-dimensional problem is the reconstruction of the states and parameters of a dynamical system, whose structure is assumed known, from an observed time series. In this work, we draw explicit parallels between the benefits of overparameterisation analysed in the deep learning context and those we observe in system identification. We test multiple chaotic time series models, analysing the optimisation process for unknown model states and parameters in batch mode. We find that gradient descent reaches better solutions if we assume more parameters to be unknown. We hypothesise that overparameterisation does indeed lead towards better minima, and that more degrees of freedom in the optimisation are beneficial so long as the system is, in principle, observable.
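- To make the setting concrete, the following is a minimal sketch (not the authors' code) of batch state reconstruction by gradient descent, assuming a Lorenz-63 system observed through its first component only; all names, constants, and the collocation-style loss are illustrative assumptions.

    # Sketch: batch reconstruction of the full state trajectory of a
    # chaotic system from partial observations, via plain gradient descent.
    import jax
    import jax.numpy as jnp

    def lorenz_step(x, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        """One explicit-Euler step of the Lorenz-63 dynamics (assumed known)."""
        dx = jnp.array([sigma * (x[1] - x[0]),
                        x[0] * (rho - x[2]) - x[1],
                        x[0] * x[1] - beta * x[2]])
        return x + dt * dx

    def simulate(x0, n_steps):
        """Roll the dynamics forward to create a reference trajectory."""
        def body(x, _):
            x_next = lorenz_step(x)
            return x_next, x_next
        _, traj = jax.lax.scan(body, x0, None, length=n_steps)
        return traj

    # Ground truth and noisy, partial observations (first component only).
    key = jax.random.PRNGKey(0)
    true_traj = simulate(jnp.array([1.0, 1.0, 1.0]), 200)
    obs = true_traj[:, 0] + 0.01 * jax.random.normal(key, (200,))

    def loss(states):
        """Measurement misfit plus dynamics residual.

        Treating the whole state sequence as unknown is the
        overparameterised setting: 3 * 200 free variables against
        200 scalar measurements.
        """
        meas = jnp.sum((states[:, 0] - obs) ** 2)
        pred = jax.vmap(lorenz_step)(states[:-1])
        dyn = jnp.sum((states[1:] - pred) ** 2)
        return meas + 10.0 * dyn  # the weighting is an illustrative choice

    # Batch gradient descent over all unknown states simultaneously.
    states = jnp.zeros((200, 3))
    grad_fn = jax.jit(jax.grad(loss))
    for i in range(5000):
        states = states - 1e-3 * grad_fn(states)

- Relaxing the hard dynamics constraint into a penalty, as above, is one way to give the optimiser the extra degrees of freedom the abstract refers to; unknown model parameters could be appended to the decision variables in the same manner.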
- Subjects :
- Mathematics - Optimization and Control
- Mathematics - Dynamical Systems
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2304.08066
- Document Type :
- Working Paper