Stochastic Conjugate Gradient Algorithm With Variance Reduction

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 30, No. 5, pp. 1360-1369
Main Authors: Jin, Xiao-Bo; Zhang, Xu-Yao; Huang, Kaizhu; Geng, Guang-Gang
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.05.2019
Summary: Conjugate gradient (CG) methods are an important class of methods for solving linear equations and nonlinear optimization problems. In this paper, we propose a new stochastic CG algorithm with variance reduction (CGVR) [1] and prove its linear convergence with the Fletcher and Reeves method for strongly convex and smooth functions. We experimentally demonstrate that the CGVR algorithm converges faster than its counterparts on four learning models, which may be convex, nonconvex, or nonsmooth. In addition, its area-under-the-curve performance on six large-scale data sets is comparable to that of the LIBLINEAR solver for the L2-regularized L2-loss, but with a significant improvement in computational efficiency.

[1] The CGVR algorithm is available on GitHub: https://github.com/xbjin/cgvr
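The summary names the two ingredients of CGVR: an SVRG-style variance-reduced gradient estimate and a Fletcher-Reeves conjugate search direction. Below is a minimal NumPy sketch of how such ingredients can be combined for a finite-sum objective f(w) = (1/n) sum_i f_i(w); the function name grad_i, the fixed step size, and the loop lengths are illustrative placeholders, not the authors' exact implementation (see the repository linked above for that).

    import numpy as np

    def cgvr_sketch(grad_i, w0, n, epochs=20, inner=100, step=0.1, seed=0):
        # Sketch of a variance-reduced stochastic CG iteration:
        # an SVRG-style control variate combined with a
        # Fletcher-Reeves conjugate direction.
        # grad_i(w, i) returns the gradient of the i-th component f_i at w.
        rng = np.random.default_rng(seed)
        w = np.asarray(w0, dtype=float).copy()
        for _ in range(epochs):
            w_snap = w.copy()
            # Full gradient at the snapshot point (the control variate).
            mu = sum(grad_i(w_snap, i) for i in range(n)) / n
            g_prev = mu.copy()
            d = -g_prev  # start each epoch with steepest descent
            for _ in range(inner):
                w = w + step * d
                i = int(rng.integers(n))
                # Variance-reduced gradient estimate at the new iterate.
                g = grad_i(w, i) - grad_i(w_snap, i) + mu
                # Fletcher-Reeves coefficient from successive gradient norms.
                beta = (g @ g) / (g_prev @ g_prev)
                d = -g + beta * d
                g_prev = g
        return w

The Fletcher-Reeves coefficient here is computed from successive variance-reduced gradient estimates rather than exact gradients, which is what distinguishes a stochastic CG scheme of this kind from classical CG; a practical implementation would typically replace the fixed step size with a line search.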
ISSN: 2162-237X
EISSN: 2162-2388
DOI: 10.1109/TNNLS.2018.2868835