Adaptive Gradient-type Methods for Convex Optimization Problems with Relative Accuracy and Sharp Minimum

Bibliographic Details
Published in: arXiv.org
Main Authors: Stonyakin, Fedor S.; Ablaev, Seydamet S.; Baran, Inna V.
Format: Paper / Journal Article
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 12.12.2021
Summary: In this paper, we consider gradient-type methods for convex positively homogeneous optimization problems with relative accuracy. An analogue of the accelerated universal gradient-type method for positively homogeneous optimization problems with relative accuracy is investigated. The second approach is related to subgradient methods with the B. T. Polyak stepsize. A result on the linear convergence rate for some methods of this type with adaptive step adjustment is obtained for a class of non-smooth problems. A generalization to a special class of non-convex non-smooth problems is also considered.
ISSN: 2331-8422
DOI: 10.48550/arxiv.2103.17159
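
The summary refers to subgradient methods with the B. T. Polyak stepsize. As a point of reference only, the following is a minimal Python sketch of the classical Polyak-stepsize subgradient method, not the adaptive variant studied in the paper: it assumes the optimal value f* is known exactly, and the function names (polyak_subgradient, the l1-norm example with its sharp minimum at the origin) are illustrative choices, not taken from the paper.

import numpy as np

def polyak_subgradient(f, subgrad, x0, f_star, n_iters=1000):
    # Classical subgradient method with the B. T. Polyak stepsize
    # h_k = (f(x_k) - f*) / ||g_k||^2, where f* is the known optimal value.
    # Illustrative sketch only; the paper's adaptive step adjustment differs.
    x = np.asarray(x0, dtype=float)
    x_best, f_best = x.copy(), f(x)
    for _ in range(n_iters):
        g = subgrad(x)
        gap = f(x) - f_star
        if gap <= 0 or np.dot(g, g) == 0:
            break  # already at (or below) the target value, or zero subgradient
        h = gap / np.dot(g, g)  # Polyak stepsize
        x = x - h * g
        if f(x) < f_best:       # track the best iterate, since f(x_k) need not decrease monotonically
            x_best, f_best = x.copy(), f(x)
    return x_best, f_best

# Usage example: f(x) = ||x||_1 is convex, positively homogeneous, and has a
# sharp minimum f* = 0 at the origin; np.sign(x) is a valid subgradient.
if __name__ == "__main__":
    f = lambda x: np.sum(np.abs(x))
    subgrad = lambda x: np.sign(x)
    x_opt, f_opt = polyak_subgradient(f, subgrad, x0=np.array([3.0, -2.0]), f_star=0.0)
    print(x_opt, f_opt)

For such sharp (positively homogeneous) objectives, the Polyak stepsize is known to yield linear convergence of the best function value, which is the regime the summary's linear-rate result concerns.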