Error bounds, PL condition, and quadratic growth for weakly convex functions, and linear convergence of proximal point methods

Bibliographic Details
Published in: arXiv.org
Main Authors: Liao, Feng-Yi; Ding, Lijun; Zheng, Yang
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 13.08.2024
Summary: Many practical optimization problems lack strong convexity. Fortunately, recent studies have revealed that first-order algorithms also enjoy linear convergence under various weaker regularity conditions. While the relationships among different conditions are well understood for convex and smooth functions, this is not the case in the nonsmooth setting. In this paper, we go beyond convexity and smoothness and clarify the connections among common regularity conditions in the class of weakly convex functions, including \(\textit{strong convexity}\), \(\textit{restricted secant inequality}\), \(\textit{subdifferential error bound}\), \(\textit{Polyak-Łojasiewicz inequality}\), and \(\textit{quadratic growth}\). In addition, using these regularity conditions, we present a simple and modular proof of linear convergence of the proximal point method (PPM) for convex and weakly convex optimization problems. Linear convergence also holds when the subproblems of PPM are solved inexactly, with proper control of the inexactness.
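
For orientation, here is one common textbook form of each regularity condition named above, sketched in notation assumed for illustration (\(f^\ast = \min f\), \(\mathcal{X}^\ast\) the solution set, \(g \in \partial f(x)\) a subgradient, and a single modulus \(\mu > 0\)); the paper's exact definitions and constants may differ.

% Illustrative (textbook) forms of the five regularity conditions;
% the paper's exact parameterizations may differ.
\begin{align*}
\text{strong convexity:}\quad & f(y) \ge f(x) + \langle g,\, y - x\rangle + \tfrac{\mu}{2}\|y - x\|^2 \quad \forall x, y \\
\text{restricted secant inequality:}\quad & \langle g,\, x - \mathrm{proj}_{\mathcal{X}^\ast}(x)\rangle \ge \mu\,\mathrm{dist}(x, \mathcal{X}^\ast)^2 \\
\text{subdifferential error bound:}\quad & \mathrm{dist}(x, \mathcal{X}^\ast) \le \tfrac{1}{\mu}\,\mathrm{dist}(0, \partial f(x)) \\
\text{Polyak-\L{}ojasiewicz inequality:}\quad & \tfrac{1}{2}\,\mathrm{dist}(0, \partial f(x))^2 \ge \mu\,\bigl(f(x) - f^\ast\bigr) \\
\text{quadratic growth:}\quad & f(x) - f^\ast \ge \tfrac{\mu}{2}\,\mathrm{dist}(x, \mathcal{X}^\ast)^2
\end{align*}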
ISSN:2331-8422
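
As a minimal runnable sketch of the proximal point method discussed in the summary (not the authors' code), the snippet below applies exact PPM, \(x_{k+1} = \operatorname{argmin}_y\, f(y) + \tfrac{1}{2\lambda}\|y - x_k\|^2\), to the hypothetical one-dimensional example \(f(t) = |t| + t^2/2\), whose proximal map has a closed form. The printed distances to the unique minimizer \(x^\ast = 0\) contract by at least the factor \(1/(1+\lambda)\) per iteration, illustrating the linear convergence the paper establishes under conditions such as quadratic growth.

import numpy as np

def prox(x, lam):
    """Proximal map of f(t) = |t| + t**2 / 2:
    argmin_y |y| + y**2/2 + (1/(2*lam)) * (y - x)**2,
    i.e. soft-thresholding by lam followed by a 1/(1 + lam) shrink."""
    return np.sign(x) * max(abs(x) - lam, 0.0) / (1.0 + lam)

def f(x):
    return abs(x) + 0.5 * x**2

# Exact proximal point method: x_{k+1} = prox_{lam*f}(x_k).
lam, x = 1.0, 5.0          # step size and starting point, both arbitrary
for k in range(10):
    x = prox(x, lam)
    # dist(x, X*) = |x| since the unique minimizer is x* = 0; each
    # iteration shrinks it by at least 1/(1 + lam) -- a linear rate.
    print(f"k={k + 1:2d}  f(x) = {f(x):.3e}  dist = {abs(x):.3e}")

If the proximal subproblem is only solved approximately, the same geometric decay survives as long as the per-step error is suitably controlled (e.g. summable), mirroring the inexact-PPM result stated in the summary.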