Global Convergence Properties of Conjugate Gradient Methods for Optimization

Bibliographic Details
Published in: SIAM Journal on Optimization, Vol. 2, No. 1, pp. 21-42
Main Authors: Gilbert, Jean Charles; Nocedal, Jorge
Format: Journal Article
Language: English
Published: Philadelphia: Society for Industrial and Applied Mathematics, 01.02.1992
Summary: This paper explores the convergence of nonlinear conjugate gradient methods without restarts and with practical line searches. The analysis covers two classes of methods that are globally convergent on smooth, nonconvex functions. Some properties of the Fletcher-Reeves method play an important role in the first family, whereas the second family shares an important property with the Polak-Ribière method. Numerical experiments are presented.
ISSN: 1052-6234, 1095-7189
DOI: 10.1137/0802003
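
For orientation, the two method families named in the summary are built around the classical Fletcher-Reeves and Polak-Ribière formulas for the parameter beta that defines each new conjugate gradient search direction. The sketch below is a generic textbook-style nonlinear CG iteration, not the specific algorithms or line-search conditions analyzed in the paper; the function names, the crude backtracking step, and the max(beta, 0) safeguard on the Polak-Ribière formula are illustrative assumptions.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, beta_rule="FR", max_iter=200, tol=1e-8):
    """Minimal nonlinear conjugate gradient sketch (illustrative only).

    beta_rule: "FR" (Fletcher-Reeves) or "PR" (Polak-Ribiere with a
    max(beta, 0) safeguard). Uses a crude Armijo backtracking step as a
    stand-in for the Wolfe-type line searches used in convergence theory.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search (Armijo sufficient-decrease test)
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        if beta_rule == "FR":
            # Fletcher-Reeves: ratio of squared gradient norms
            beta = g_new.dot(g_new) / g.dot(g)
        else:
            # Polak-Ribiere, truncated at zero (the "PR+" safeguard)
            beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d       # new search direction
        x, g = x_new, g_new
    return x

# Usage example: run both beta rules on the Rosenbrock function
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(nonlinear_cg(f, grad, np.array([-1.2, 1.0]), beta_rule="FR"))
print(nonlinear_cg(f, grad, np.array([-1.2, 1.0]), beta_rule="PR"))
```

Running the example simply exercises both beta rules; the global convergence guarantees studied in the paper rely on line-search conditions that this backtracking stand-in does not enforce.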