Global Convergence Properties of Conjugate Gradient Methods for Optimization
| Published in | SIAM Journal on Optimization, Vol. 2, No. 1, pp. 21-42 |
|---|---|
| Main Authors | Gilbert, Jean Charles; Nocedal, Jorge |
| Format | Journal Article |
| Language | English |
| Published | Philadelphia: Society for Industrial and Applied Mathematics, 01.02.1992 |
| Summary | This paper explores the convergence of nonlinear conjugate gradient methods without restarts, and with practical line searches. The analysis covers two classes of methods that are globally convergent on smooth, nonconvex functions. Some properties of the Fletcher-Reeves method play an important role in the first family, whereas the second family shares an important property with the Polak-Ribière method. Numerical experiments are presented. |
| ISSN | 1052-6234; 1095-7189 |
| DOI | 10.1137/0802003 |
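
The summary refers to the Fletcher-Reeves and Polak-Ribière choices of the conjugate gradient parameter. The minimal Python sketch below illustrates how a nonlinear conjugate gradient iteration without restarts differs between the two choices of the parameter β; it is not the paper's exact scheme. The Armijo backtracking line search and the descent-direction safeguard are simple stand-ins for the practical (Wolfe-type) line searches analyzed in the paper, and the function and variable names are illustrative.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, variant="FR", max_iter=500, tol=1e-6):
    """Illustrative nonlinear conjugate gradient iteration (not the paper's exact method).

    variant="FR" uses the Fletcher-Reeves beta; variant="PR+" uses the
    Polak-Ribiere beta truncated at zero.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        slope = g @ d
        if slope >= 0:                       # safeguard added here, not part of the analyzed schemes
            d, slope = -g, -(g @ g)
        # Armijo backtracking line search, a stand-in for a Wolfe-condition search.
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        for _ in range(60):
            if f(x + alpha * d) <= fx + c * alpha * slope:
                break
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        if variant == "FR":                  # Fletcher-Reeves beta
            beta = (g_new @ g_new) / (g @ g)
        else:                                # Polak-Ribiere beta, truncated at zero
            beta = max((g_new @ (g_new - g)) / (g @ g), 0.0)
        d = -g_new + beta * d                # next search direction, no restarts
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Hypothetical test problem: the two-dimensional Rosenbrock function.
    f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
        200 * (x[1] - x[0] ** 2),
    ])
    print(nonlinear_cg(f, grad, np.array([-1.2, 1.0]), variant="PR+"))
```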