Generalized symmetric ADMM for separable convex optimization

Bibliographic Details
Published in: Computational Optimization and Applications, Vol. 70, No. 1, pp. 129–170
Main Authors: Bai, Jianchao; Li, Jicheng; Xu, Fengmin; Zhang, Hongchao
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.05.2018
Summary: The alternating direction method of multipliers (ADMM) has proved effective for solving separable convex optimization problems subject to linear constraints. In this paper, we propose a generalized symmetric ADMM (GS-ADMM), which updates the Lagrange multiplier twice with suitable stepsizes, to solve multi-block separable convex programming problems. GS-ADMM partitions the variables into two groups, one consisting of p block variables and the other of q block variables, where p ≥ 1 and q ≥ 1 are integers. The two groups are updated in a Gauss–Seidel fashion, while the variables within each group are updated in a Jacobi (parallel) fashion, which makes the method attractive for big-data settings. By adding proper proximal terms to the subproblems, we specify a domain for the stepsizes that guarantees GS-ADMM is globally convergent with a worst-case O(1/t) ergodic convergence rate. This stepsize domain turns out to be significantly larger than the convergence domains established in the literature, so GS-ADMM is more flexible and allows larger stepsizes for the dual variable. In addition, two special cases of GS-ADMM that allow zero penalty terms are discussed and analyzed. Compared with several state-of-the-art methods, preliminary numerical experiments on a sparse matrix minimization problem from statistical learning show that the proposed method is effective and promising.
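To make the twice-updated multiplier concrete, below is a minimal sketch in Python of the simplest case (p = q = 1): the two primal blocks are updated in Gauss–Seidel order, and the dual variable is updated with stepsize tau between them and with stepsize s afterwards. The quadratic test problem, the penalty rho, the stepsize values, and the helper name gs_admm_two_block are illustrative assumptions, not the paper's test problem or its enlarged stepsize domain; the stepsizes are merely chosen inside the classical symmetric-ADMM range.

```python
# Sketch of a symmetric ADMM iteration for
#   minimize 0.5*||x - a||^2 + 0.5*||y - c||^2  subject to  x + y = b,
# chosen so both subproblems have closed-form solutions.
import numpy as np

def gs_admm_two_block(a, c, b, rho=1.0, tau=0.9, s=0.9, iters=200):
    """Symmetric ADMM: x-update, dual half-step, y-update, dual step."""
    n = a.size
    x, y, lam = np.zeros(n), np.zeros(n), np.zeros(n)
    for _ in range(iters):
        # x-subproblem of the augmented Lagrangian (closed form):
        # argmin_x 0.5||x-a||^2 - lam.(x+y-b) + 0.5*rho*||x+y-b||^2
        x = (a + lam - rho * (y - b)) / (1.0 + rho)
        # first dual update, stepsize tau
        lam = lam - tau * rho * (x + y - b)
        # y-subproblem, using the intermediate multiplier
        y = (c + lam - rho * (x - b)) / (1.0 + rho)
        # second dual update, stepsize s
        lam = lam - s * rho * (x + y - b)
    return x, y, lam

a, c, b = np.array([3.0]), np.array([-1.0]), np.array([1.0])
x, y, lam = gs_admm_two_block(a, c, b)
# For this instance the optimum is x = 2.5, y = -1.5, lam = -0.5;
# the residual x + y - b should be near zero.
print(x, y, x + y - b)
```

In the general multi-block method described in the abstract, each group would contain several blocks updated in parallel (Jacobi) with proximal terms added to the subproblems; this two-block sketch only illustrates the Gauss–Seidel ordering between the groups and the two dual updates.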
ISSN: 0926-6003
eISSN: 1573-2894
DOI: 10.1007/s10589-017-9971-0