A mini-batch algorithm for large-scale learning problems with adaptive step size

Bibliographic Details
Published in: Digital Signal Processing, Vol. 143, p. 104230
Main Authors: He, Chongyang; Zhang, Yiting; Zhu, Dingyu; Cao, Mingyuan; Yang, Yueting
Format: Journal Article
Language: English
Published: Elsevier Inc., 01.11.2023

Summary: Step size selection in stochastic optimization methods is a crucial aspect of both theoretical analysis and practical applications. We propose two stochastic optimization methods based on the competitive Barzilai-Borwein (BB) step size, applied in the inner and outer loops of the mini-batch semi-stochastic gradient descent (mS2GD) algorithm. The competitive BB step size is updated automatically using the most recent, and hence most accurate, secant equation. We introduce two algorithms: mS2GD-CBB, which updates the step size in the outer loop, and mS2GD-RCBB, which updates it in the inner loop. We evaluate the proposed algorithms on classical optimization problems and compare them against existing methods. Experimental results demonstrate favorable convergence properties and show that the methods can effectively handle the challenges of big data arising in signal processing, statistics, and machine learning. Their enhanced adaptability, flexibility, and exploration capabilities make them promising for a wide range of optimization problems.
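For orientation only: the paper's exact "competitive BB" rule is defined in the full text, but the sketch below illustrates where a BB step size enters a variance-reduced loop of this kind. It is a minimal Python rendering of an mS2GD/SVRG-style method whose outer-loop step is set by the classical BB1 rule (as in SVRG-BB), mirroring the step-size placement of mS2GD-CBB; the function names grad_full and grad_batch, the fixed inner-loop length m, and the BB1 stand-in formula are all illustrative assumptions, not the authors' implementation.

```python
# Sketch of a mini-batch semi-stochastic gradient loop with a BB step size
# chosen in the outer loop from the latest secant pair:
#   eta_k = s^T s / (m * s^T y),  s = x_k - x_{k-1},  y = g_k - g_{k-1}.
import numpy as np

def ms2gd_bb(grad_full, grad_batch, x0, n, m=50, batch=10, eta0=0.01,
             epochs=20, rng=None):
    """grad_full(x): full gradient; grad_batch(x, idx): mini-batch gradient."""
    rng = rng or np.random.default_rng(0)
    x, eta = x0.copy(), eta0
    x_prev = g_prev = None
    for _ in range(epochs):
        g = grad_full(x)                      # full gradient at the snapshot
        if x_prev is not None:                # BB1 step from the secant pair
            s, y = x - x_prev, g - g_prev
            sy = s @ y
            if sy > 1e-12:                    # skip update on bad curvature
                eta = (s @ s) / (m * sy)
        x_prev, g_prev = x.copy(), g.copy()
        w = x.copy()
        for _ in range(m):                    # inner semi-stochastic loop
            idx = rng.choice(n, size=batch, replace=False)
            # variance-reduced search direction (SVRG-type estimator)
            v = grad_batch(w, idx) - grad_batch(x, idx) + g
            w -= eta * v
        x = w   # mS2GD samples the inner iterate; last iterate used here
    return x
```

A variant in the spirit of mS2GD-RCBB would instead recompute the BB step inside the inner loop from per-iteration secant pairs; the outer-loop placement above keeps the sketch short.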
ISSN: 1051-2004, 1095-4333
DOI: 10.1016/j.dsp.2023.104230