Optimizing the Efficiency of First-Order Methods for Decreasing the Gradient of Smooth Convex Functions
Published in: Journal of Optimization Theory and Applications, Vol. 188, No. 1, pp. 192-219
Main Authors:
Format: Journal Article
Language: English
Published: New York: Springer US, 01.01.2021 (Springer Nature B.V.)
Summary: This paper optimizes the step coefficients of first-order methods for smooth convex minimization in terms of the worst-case convergence bound (i.e., efficiency) of the decrease in the gradient norm. The work is based on the performance estimation problem approach. The worst-case gradient bound of the resulting method is optimal up to a constant for large-dimensional smooth convex minimization problems, under an initial bound on the cost function value. The paper then shows that the proposed method has a computationally efficient form similar to the optimized gradient method (see the sketch below).
ISSN: 0022-3239, 1573-2878
DOI: 10.1007/s10957-020-01770-2
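The abstract describes methods built from per-iteration step coefficients applied to past gradients. The following is a minimal sketch of that general fixed-step first-order method class, not the paper's optimized method: the coefficient matrix `h`, the quadratic test function, and all names are illustrative assumptions (here `h` reduces to plain gradient descent), since the optimized coefficients are not reproduced in this record.

```python
# Sketch of the class of fixed-step first-order methods whose step
# coefficients such papers optimize:
#   x_{k+1} = x_k - (1/L) * sum_{i=0}^{k} h[k][i] * grad_f(x_i)
# The coefficients h[k][i] below are placeholders (plain gradient descent),
# NOT the optimized coefficients derived in the paper.
import numpy as np

def general_first_order_method(grad_f, x0, L, h, N):
    """Run N iterations of the general fixed-step first-order method."""
    xs = [np.asarray(x0, dtype=float)]
    grads = []
    for k in range(N):
        grads.append(grad_f(xs[k]))
        step = sum(h[k][i] * grads[i] for i in range(k + 1))
        xs.append(xs[k] - step / L)
    return xs

# Illustrative smooth convex quadratic f(x) = 0.5 * x^T A x, with L = lambda_max(A).
A = np.diag([1.0, 10.0])
L = 10.0
grad_f = lambda x: A @ x

N = 20
# Plain gradient descent: h[k][i] = 1 if i == k else 0.
h_gd = [[1.0 if i == k else 0.0 for i in range(k + 1)] for k in range(N)]
xs = general_first_order_method(grad_f, np.array([1.0, 1.0]), L, h_gd, N)
print("final gradient norm:", np.linalg.norm(grad_f(xs[-1])))
```

Under these assumptions, the quantity printed at the end (the gradient norm at the last iterate) is the worst-case measure the paper's analysis targets; the paper's contribution is the choice of coefficients `h` that minimizes its worst-case bound over smooth convex functions.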