Variance Reduction Techniques for Stochastic Proximal Point Algorithms

Bibliographic Details
Published in: Journal of Optimization Theory and Applications, Vol. 203, No. 2, pp. 1910–1939
Main Authors: Traoré, Cheik; Apidopoulos, Vassilis; Salzo, Saverio; Villa, Silvia
Format: Journal Article
Language: English
Published: New York: Springer US, 01.11.2024
Summary: In the context of finite-sum minimization, variance reduction techniques are widely used to improve the performance of state-of-the-art stochastic gradient methods. Both their practical impact and their theoretical properties are well established. Stochastic proximal point algorithms have been studied as an alternative to stochastic gradient algorithms since they are more stable with respect to the choice of the step size. However, their variance-reduced versions are not as well studied as the gradient ones. In this work, we propose the first unified study of variance reduction techniques for stochastic proximal point algorithms. We introduce a generic proximal-based stochastic algorithm that can be specified to give the proximal versions of SVRG, SAGA, and some of their variants. For this algorithm, in the smooth setting, we provide several convergence rates for the iterates and the objective function values, which are faster than those of the vanilla stochastic proximal point algorithm. More specifically, for convex functions, we prove a sublinear convergence rate of O(1/k). In addition, under the Polyak-Łojasiewicz condition, we obtain linear convergence rates. Finally, our numerical experiments demonstrate the advantages of the proximal variance reduction methods over their gradient counterparts in terms of stability with respect to the choice of the step size in most cases, especially for difficult problems.
ISSN: 0022-3239, 1573-2878
DOI: 10.1007/s10957-024-02502-6