A Bregman-Style Improved ADMM and its Linearized Version in the Nonconvex Setting: Convergence and Rate Analyses

Bibliographic Details
Published in: Journal of the Operations Research Society of China (Internet), Vol. 12, No. 2, pp. 298-340
Main Authors: Liu, Peng-Jie; Jian, Jin-Bao; Shao, Hu; Wang, Xiao-Quan; Xu, Jia-Wei; Wu, Xiao-Yu
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.06.2024

Summary: This work explores a family of two-block nonconvex optimization problems subject to linear constraints. We first introduce a simple but universal Bregman-style improved alternating direction method of multipliers (ADMM) based on the iteration framework of ADMM and the Bregman distance. Then, we exploit the smoothness of one of the component functions to develop a linearized version of the method. Compared with traditional ADMM, both proposed methods integrate a convex combination strategy into the multiplier update step. For each proposed method, we establish convergence of the entire iteration sequence to a critical point of the augmented Lagrangian function by means of the Kurdyka–Łojasiewicz property, and we derive convergence rates for both the sequence of merit function values and the iteration sequence. Finally, numerical results on the Lasso model show that the proposed methods are effective and encouraging.
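
As a concrete reference point for the setting described above, the sketch below shows a plain two-block ADMM applied to the Lasso model mentioned in the summary. It is only a minimal generic baseline under standard assumptions: it uses the usual multiplier (dual ascent) update rather than the convex-combination update of the proposed Bregman-style methods, and the function names and parameters (admm_lasso, beta, iters) are illustrative choices, not the authors' implementation.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, mu, beta=1.0, iters=200):
    # Generic two-block ADMM for  min (1/2)||Ax - b||^2 + mu*||z||_1  s.t.  x - z = 0.
    # A textbook baseline only; NOT the article's Bregman-style improved or linearized ADMM.
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    lam = np.zeros(n)                               # Lagrange multiplier
    M = np.linalg.inv(A.T @ A + beta * np.eye(n))   # cached inverse for the x-subproblem
    Atb = A.T @ b
    for _ in range(iters):
        x = M @ (Atb + beta * z - lam)              # x-update: solve the quadratic subproblem
        z = soft_threshold(x + lam / beta, mu / beta)   # z-update: prox of the l1 term
        lam = lam + beta * (x - z)                  # standard multiplier update
    return z

According to the summary, the proposed methods differ from this baseline by adding Bregman-distance terms to the subproblems and by replacing the last update with a convex combination strategy for the multiplier; those details are given in the full text.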
ISSN: 2194-668X, 2194-6698
DOI: 10.1007/s40305-023-00535-8