A class of conjugate gradient methods for convex constrained monotone equations

Bibliographic Details
Published in: Optimization, Vol. 66, No. 12, pp. 2309-2328
Main Authors: Ding, Yanyun; Xiao, Yunhai; Li, Jianwei
Format: Journal Article
Language: English
Published: Philadelphia: Taylor & Francis, 02.12.2017

Summary: The recently designed non-linear conjugate gradient method of Dai and Kou [SIAM J Optim. 2013;23:296-320] is currently one of the most efficient methods for solving large-scale unconstrained minimization problems, owing to its simple iterative form, low storage requirement, and its closeness to the scaled memoryless BFGS method. Because of these attractive properties, the method has in recent years been extended successfully to solve high-dimensional symmetric non-linear equations. Nevertheless, its numerical performance in solving convex constrained monotone equations has never been explored. In this paper, by combining it with the projection method of Solodov and Svaiter, we develop a family of non-linear conjugate gradient methods for convex constrained monotone equations. The proposed methods require neither the Jacobian of the equations nor the storage of any matrix at each iteration, which makes them well suited to high-dimensional non-smooth problems. We prove the global convergence of the proposed class of methods and establish its R-linear convergence rate under reasonable conditions. Finally, numerical experiments show that the proposed methods are efficient and promising.
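
The abstract describes the overall framework but gives no formulas. As a rough, hedged illustration only, the sketch below shows a generic derivative-free projection method of the Solodov-Svaiter type combined with a conjugate-gradient-type search direction, which is the kind of scheme the paper studies. The function names, the PRP-like beta formula, the line-search constants, and the toy test problem are assumptions made for illustration and are not taken from the paper.

```python
# Minimal sketch (not the authors' exact algorithm) of a derivative-free
# projection method with a conjugate-gradient-type direction for solving
# F(x) = 0 over a convex set C. The beta formula, line-search rule,
# constants, and test problem are illustrative assumptions.
import numpy as np

def projection_cg_solve(F, project, x0, tol=1e-6, max_iter=1000,
                        rho=0.5, sigma=1e-4):
    """Find x in C with F(x) ~ 0, where `project` is the Euclidean projector onto C."""
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx                                   # initial direction: -F(x0)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        # Derivative-free backtracking line search:
        # accept alpha once -F(x + alpha d)^T d >= sigma * alpha * ||d||^2.
        alpha = 1.0
        while True:
            z = x + alpha * d
            Fz = F(z)
            if -(Fz @ d) >= sigma * alpha * (d @ d) or alpha < 1e-12:
                break
            alpha *= rho
        # Solodov-Svaiter-type projection step: move x across the hyperplane
        # {u : F(z)^T (u - z) = 0} separating it from the solution set, then
        # project back onto C.
        denom = Fz @ Fz
        if denom > 0.0:
            x_new = project(x - ((Fz @ (x - z)) / denom) * Fz)
        else:
            x_new = project(z)                # z already solves the equation
        Fx_new = F(x_new)
        # Conjugate-gradient-type direction update; a simple PRP-like beta is
        # used here as a stand-in for the Dai-Kou-based family in the paper.
        y = Fx_new - Fx
        beta = (Fx_new @ y) / max(Fx @ Fx, 1e-16)
        d = -Fx_new + beta * d
        x, Fx = x_new, Fx_new
    return x

# Toy usage (assumed example): a componentwise monotone system on the
# nonnegative orthant, F(x) = x + sin(x) - 1, C = {x : x >= 0}.
if __name__ == "__main__":
    F = lambda x: x + np.sin(x) - 1.0
    project = lambda x: np.maximum(x, 0.0)
    x_star = projection_cg_solve(F, project, np.zeros(5))
    print(x_star, np.linalg.norm(F(x_star)))
```

In schemes of this kind only evaluations of F are used: no Jacobian is formed and no matrix is stored, which matches the storage claim in the abstract.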
ISSN: 0233-1934
EISSN: 1029-4945
DOI: 10.1080/02331934.2017.1372438