A novel interpretation of Nesterov's acceleration via variable step-size linear multistep methods
- Publication Year :
- 2024
Abstract
- Nesterov's acceleration in continuous optimization can be understood in a novel way when Nesterov's accelerated gradient (NAG) method is viewed as a linear multistep (LM) method for gradient flow. Although the NAG method for strongly convex functions (NAG-sc) has been fully analyzed from this viewpoint, the NAG method for $L$-smooth convex functions (NAG-c) has not. To fill this gap, we show that the existing NAG-c method can be interpreted as a variable step-size LM (VLM) method for gradient flow. Surprisingly, the VLM admits linearly increasing step sizes, which explains the acceleration in the convex case. We introduce a novel technique for analyzing the absolute stability of VLMs, and subsequently prove that NAG-c is optimal within a certain natural class of VLMs. Finally, we construct a new, broader class of VLMs by optimizing the VLM parameters for ill-conditioned problems. Numerical experiments show that the proposed method outperforms NAG-c in ill-conditioned cases. These results suggest that the numerical-analysis perspective on NAG is a promising framework, and that considering broader classes of VLMs could reveal further novel methods.
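- For concreteness, below is a minimal sketch of the standard NAG-c recursion the abstract refers to, run against plain gradient descent (explicit Euler for the gradient flow) on an ill-conditioned quadratic. The step size $1/L$ and momentum weight $k/(k+3)$ are the usual textbook choices; the paper's VLM reformulation and its optimized parameters are not reproduced here.

```python
import numpy as np

# Sketch: standard NAG-c vs. plain gradient descent on an ill-conditioned
# quadratic f(x) = 0.5 * x^T A x. This illustrates the method the abstract
# analyzes; the paper's variable step-size LM construction is NOT reproduced.

def nag_c(grad, x0, L, iters):
    """Nesterov's accelerated gradient for an L-smooth convex function."""
    x = np.array(x0, dtype=float)
    y = x.copy()
    s = 1.0 / L                                  # gradient step size 1/L
    for k in range(iters):
        x_new = y - s * grad(y)                  # gradient step at extrapolated point
        y = x_new + (k / (k + 3)) * (x_new - x)  # momentum weight k/(k+3)
        x = x_new
    return x

def gradient_descent(grad, x0, L, iters):
    """Explicit Euler for the gradient flow x' = -grad f(x), step 1/L."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        x = x - grad(x) / L
    return x

A = np.diag([1e-4, 1.0])                         # condition number 1e4
f = lambda x: 0.5 * x @ (A @ x)
grad = lambda x: A @ x
x0 = np.array([1.0, 1.0])
L = 1.0                                          # largest eigenvalue of A

for n in (10, 100, 1000):
    print(f"iters={n:5d}  f(GD)={f(gradient_descent(grad, x0, L, n)):.3e}"
          f"  f(NAG-c)={f(nag_c(grad, x0, L, n)):.3e}")
```

- On this problem the accelerated iterate drives $f$ down markedly faster than gradient descent, consistent with the linearly increasing effective step sizes that the abstract attributes to the VLM interpretation.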
- Subjects :
- Mathematics - Numerical Analysis
- 65L06, 65L20, 90C25
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2404.10238
- Document Type :
- Working Paper