Adaptive Gradient-type Methods for Convex Optimization Problems with Relative Accuracy and Sharp Minimum
Published in | arXiv.org |
---|---|
Main Authors | |
Format | Paper; Journal Article |
Language | English |
Published | Ithaca: Cornell University Library, arXiv.org, 12.12.2021 |
Subjects | |
Online Access | Get full text |
Summary: | In this paper, we consider gradient-type methods for convex positively homogeneous optimization problems with relative accuracy. An analogue of the accelerated universal gradient-type method for positively homogeneous optimization problems with relative accuracy is investigated. The second approach is related to subgradient methods with the B. T. Polyak stepsize. A linear convergence rate result is obtained for some methods of this type with adaptive step adjustment on a certain class of non-smooth problems. A generalization to a special class of non-convex non-smooth problems is also considered. (A minimal sketch of the Polyak stepsize rule follows this record.) |
---|---|
ISSN: | 2331-8422 |
DOI: | 10.48550/arxiv.2103.17159 |
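
The summary above mentions subgradient methods with the B. T. Polyak stepsize, i.e. the rule h_k = (f(x_k) - f*) / ||g_k||^2 with known optimal value f*. The sketch below is only a minimal, non-adaptive illustration of that classical rule on a convex, positively homogeneous function with a sharp minimum; it is not the adaptive scheme analysed in the paper, and the names `polyak_subgradient`, `f`, `subgrad`, `x0`, `f_star`, and `n_iters` are illustrative assumptions, not taken from the source.

```python
import numpy as np


def polyak_subgradient(f, subgrad, x0, f_star, n_iters=100):
    """Subgradient method with the classical B. T. Polyak stepsize
    h_k = (f(x_k) - f_star) / ||g_k||^2, which assumes the optimal
    value f_star is known.  Illustrative sketch only."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for _ in range(n_iters):
        g = np.asarray(subgrad(x), dtype=float)
        gnorm2 = float(np.dot(g, g))
        if gnorm2 == 0.0:             # a zero subgradient certifies a minimizer
            break
        h = (f(x) - f_star) / gnorm2  # Polyak stepsize
        x = x - h * g
        fx = f(x)
        if fx < best_f:               # track the best iterate seen so far
            best_x, best_f = x.copy(), fx
    return best_x, best_f


if __name__ == "__main__":
    # f(x) = ||x||_1 is convex, positively homogeneous, and has a sharp
    # minimum at the origin with optimal value f_star = 0.
    f = lambda x: float(np.sum(np.abs(x)))
    subgrad = lambda x: np.sign(x)    # a valid subgradient of the l1-norm
    x_hat, f_hat = polyak_subgradient(f, subgrad, x0=[3.0, -2.0], f_star=0.0)
    print(x_hat, f_hat)               # converges to the origin
```

For this particular l1 example and starting point the Polyak rule reaches the exact minimizer after only two steps, which illustrates the kind of fast convergence behaviour that a sharp minimum makes possible for such stepsize rules.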