
The Improved Stochastic Fractional Order Gradient Descent Algorithm

Authors :
Yang Yang
Lipo Mo
Yusen Hu
Fei Long
Source :
Fractal and Fractional, Vol 7, Iss 8, p 631 (2023)
Publication Year :
2023
Publisher :
MDPI AG

Abstract

This paper proposes improved stochastic gradient descent (SGD) algorithms with a fractional-order gradient for the online optimization problem. For three scenarios, namely a standard learning rate, an adaptive-gradient learning rate, and a momentum learning rate, three new SGD algorithms are designed by combining a fractional-order gradient, and the corresponding regret functions are shown to converge at a sub-linear rate. The impact of the fractional order on convergence and monotonicity is then discussed, and it is proved that better performance can be obtained by adjusting the order of the fractional gradient. Finally, several practical examples are given to verify the superiority and validity of the proposed algorithms.
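The abstract does not reproduce the update rules, but a common way to fold a fractional-order gradient into these three SGD variants is a leading-term Caputo-type approximation, which rescales the ordinary gradient by a power of the most recent step and reduces to plain SGD when the order equals one. The Python sketch below illustrates the three scenarios under that assumption only; the function names (frac_grad, sgd_step, adagrad_step, momentum_step), the order alpha = 0.9, and the toy objective are hypothetical and are not taken from the paper.

    import numpy as np
    from math import gamma

    def frac_grad(x, x_prev, grad, alpha=0.9, eps=1e-8):
        # Leading-term Caputo-style approximation of the order-alpha gradient:
        # grad * |x - x_prev|^(1 - alpha) / Gamma(2 - alpha).
        # With alpha = 1 this recovers the ordinary gradient; eps keeps the
        # step nonzero when x == x_prev. (Assumed form, not the paper's rule.)
        return grad * (np.abs(x - x_prev) ** (1.0 - alpha) + eps) / gamma(2.0 - alpha)

    # Scenario 1: standard (fixed) learning rate.
    def sgd_step(x, x_prev, grad, lr, alpha):
        return x - lr * frac_grad(x, x_prev, grad, alpha)

    # Scenario 2: adaptive-gradient (AdaGrad-style) learning rate.
    def adagrad_step(x, x_prev, grad, lr, alpha, accum, eps=1e-8):
        g = frac_grad(x, x_prev, grad, alpha)
        accum = accum + g ** 2               # running sum of squared gradients
        return x - lr * g / (np.sqrt(accum) + eps), accum

    # Scenario 3: momentum learning rate.
    def momentum_step(x, x_prev, grad, lr, alpha, velocity, beta=0.9):
        g = frac_grad(x, x_prev, grad, alpha)
        velocity = beta * velocity + lr * g  # exponentially weighted step
        return x - velocity, velocity

    # Toy usage: minimize f(x) = (x - 3)^2 with noisy (stochastic) gradients.
    rng = np.random.default_rng(0)
    x_prev, x = 0.0, 0.1
    for _ in range(200):
        grad = 2.0 * (x - 3.0) + rng.normal(scale=0.1)
        x, x_prev = sgd_step(x, x_prev, grad, lr=0.05, alpha=0.9), x

Adjusting alpha trades off early step size against behavior near the optimum, which is the tuning knob whose effect on convergence and monotonicity the paper analyzes.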

Details

Language :
English
ISSN :
2504-3110
Volume :
7
Issue :
8
Database :
Directory of Open Access Journals
Journal :
Fractal and Fractional
Publication Type :
Academic Journal
Accession number :
edsdoj.033fdc43fade4fe5ac76e7faffec5a71
Document Type :
article
Full Text :
https://doi.org/10.3390/fractalfract7080631