Stochastic Primal–Dual Hybrid Gradient Algorithm with Adaptive Step Sizes

Bibliographic Details
Published in: Journal of Mathematical Imaging and Vision, Vol. 66, No. 3, pp. 294–313
Main Authors: Chambolle, Antonin; Delplancke, Claire; Ehrhardt, Matthias J.; Schönlieb, Carola-Bibiane; Tang, Junqi
Format: Journal Article
Language: English
Published: New York: Springer US, 01.06.2024

Summary: In this work, we propose a new primal–dual algorithm with adaptive step sizes. The stochastic primal–dual hybrid gradient (SPDHG) algorithm with constant step sizes has become widely applied in large-scale convex optimization across many scientific fields due to its scalability. While the product of the primal and dual step sizes is subject to an upper bound in order to ensure convergence, the selection of the ratio of the step sizes is critical in applications. Up to now, there has been no systematic and successful way of selecting the primal and dual step sizes for SPDHG. In this work, we propose a general class of adaptive SPDHG (A-SPDHG) algorithms and prove their convergence under weak assumptions. We also propose concrete parameter-updating strategies which satisfy the assumptions of our theory and thereby lead to convergent algorithms. Numerical examples on computed tomography demonstrate the effectiveness of the proposed schemes.
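
To make the step-size constraint in the summary concrete, the following is a minimal Python sketch of the constant-step-size SPDHG iteration on a toy blockwise least-squares problem. All names, the problem instance, and the specific bound tau * sigma_i * ||A_i||^2 <= gamma * p_i (gamma < 1) are illustrative assumptions; this is not the paper's A-SPDHG scheme, and the exact convergence condition is stated in the article.

    import numpy as np

    # Toy problem: min_x  sum_i 0.5*||A_i x - b_i||^2  +  0.5*lam*||x||^2
    rng = np.random.default_rng(0)
    n_blocks, m, d = 4, 30, 20
    A = [rng.standard_normal((m, d)) for _ in range(n_blocks)]
    x_true = rng.standard_normal(d)
    b = [Ai @ x_true for Ai in A]
    lam = 1e-2                                    # regularizer strength

    norms = [np.linalg.norm(Ai, 2) for Ai in A]   # spectral norms ||A_i||
    p = 1.0 / n_blocks                            # uniform sampling probability

    # The product tau * sigma_i * ||A_i||^2 is capped; the split between
    # tau and sigma_i (the "ratio") is the free choice the paper addresses.
    gamma, tau = 0.99, 0.1
    sigma = [gamma * p / (tau * ni ** 2) for ni in norms]

    x = np.zeros(d)
    y = [np.zeros(m) for _ in range(n_blocks)]
    z = sum(Ai.T @ yi for Ai, yi in zip(A, y))    # z tracks A^T y
    zbar = z.copy()

    for k in range(5000):
        # Primal prox step for g(x) = 0.5*lam*||x||^2
        x = (x - tau * zbar) / (1.0 + tau * lam)
        # Sample one dual block; prox of sigma_i * f_i^* for
        # f_i(u) = 0.5*||u - b_i||^2 is v -> (v - sigma_i*b_i)/(1 + sigma_i)
        i = rng.integers(n_blocks)
        v = y[i] + sigma[i] * (A[i] @ x)
        y_new = (v - sigma[i] * b[i]) / (1.0 + sigma[i])
        dz = A[i].T @ (y_new - y[i])
        y[i] = y_new
        z += dz
        zbar = z + dz / p                         # extrapolation (theta = 1)

    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))

Note that tau is a free parameter here: any tau > 0 satisfies the product bound once each sigma_i is set accordingly. Choosing that ratio well is exactly the problem the paper's adaptive step-size strategies are designed to address.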
ISSN: 0924-9907, 1573-7683
DOI: 10.1007/s10851-024-01174-1