The Improved Stochastic Fractional Order Gradient Descent Algorithm

Bibliographic Details
Published in: Fractal and Fractional, Vol. 7, No. 8, p. 631
Main Authors: Yang, Yang; Mo, Lipo; Hu, Yusen; Long, Fei
Format: Journal Article
Language: English
Published: Basel, MDPI AG, 01.08.2023

Summary: This paper mainly proposes some improved stochastic gradient descent (SGD) algorithms with a fractional-order gradient for the online optimization problem. For three scenarios, namely the standard learning rate, the adaptive gradient learning rate, and the momentum learning rate, three new SGD algorithms are designed by combining a fractional-order gradient, and the corresponding regret functions are shown to converge at a sub-linear rate. The impact of the fractional order on convergence and monotonicity is then discussed, and it is proved that better performance can be obtained by adjusting the order of the fractional gradient. Finally, several practical examples are given to verify the superiority and validity of the proposed algorithms.
ISSN: 2504-3110
DOI: 10.3390/fractalfract7080631
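
The abstract describes combining SGD with a fractional-order gradient under standard, adaptive, and momentum learning rates. As a rough illustration only, the sketch below shows a plain-learning-rate SGD step in which the ordinary gradient is scaled by a common Caputo-type finite-difference factor |x_k - x_{k-1}|^{1-alpha} / Gamma(2 - alpha); the function name fractional_sgd_step, the step size, the order alpha = 0.8, and the toy quadratic objective are illustrative assumptions and are not taken from the paper, which further develops adaptive-gradient and momentum variants with sub-linear regret bounds.

```python
# Illustrative sketch only; not the paper's exact update rule.
import numpy as np
from math import gamma

def fractional_sgd_step(x, x_prev, grad, lr=0.1, alpha=0.8, eps=1e-8):
    # Scale the ordinary gradient by |x - x_prev|^(1 - alpha) / Gamma(2 - alpha),
    # a widely used finite-difference surrogate for a Caputo fractional derivative.
    scale = np.abs(x - x_prev) ** (1.0 - alpha) / gamma(2.0 - alpha)
    return x - lr * grad * (scale + eps)

# Toy usage: minimize f(x) = 0.5 * ||x||^2 from noisy (stochastic) gradients.
rng = np.random.default_rng(0)
x_prev = np.ones(3)
x = 0.9 * np.ones(3)
for _ in range(200):
    grad = x + 0.01 * rng.standard_normal(3)  # stochastic gradient of f
    x, x_prev = fractional_sgd_step(x, x_prev, grad), x
print(x)  # iterates settle near the minimizer at the origin
```

In this sketch the scaling factor shrinks as consecutive iterates get closer, so the effective step size is damped near a stationary point and the damping strength depends on the chosen order alpha; this is only meant to make the tuning role of the fractional order concrete, not to reproduce the specific algorithms or regret analysis of the paper.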