On the Convergence of Block Coordinate Descent Type Methods

Bibliographic Details
Published in: SIAM Journal on Optimization, Vol. 23, No. 4, pp. 2037-2060
Main Authors: Beck, Amir; Tetruashvili, Luba
Format: Journal Article
Language: English
Published: Philadelphia: Society for Industrial and Applied Mathematics, 2013
Summary: In this paper we study smooth convex programming problems in which the vector of decision variables is split into several blocks. We analyze the block coordinate gradient projection method, in which each iteration performs a gradient projection step with respect to a certain block taken in cyclic order. A global sublinear rate of convergence is established for this method, and it is shown that the method can be accelerated when the problem is unconstrained. In the unconstrained setting we also prove a sublinear rate of convergence for the so-called alternating minimization method when the number of blocks is two. When the objective function is additionally assumed to be strongly convex, a linear rate of convergence is established.
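Code sketch: the summary describes the cyclic block coordinate gradient projection scheme only in words; the Python snippet below is a minimal, hypothetical illustration of such a scheme, not code from the paper. The function name cyclic_block_gradient_projection, the arguments grad, project, blocks, L, x0, and iters, and the least-squares example data are assumptions introduced here for illustration; the values L[i] stand in for per-block Lipschitz constants of the gradient.

```python
import numpy as np

def cyclic_block_gradient_projection(grad, project, blocks, L, x0, iters=200):
    """Cyclic block coordinate gradient projection (illustrative sketch).

    grad(x)    -- gradient of the smooth convex objective f at x
    project[i] -- Euclidean projection onto the feasible set of block i
    blocks[i]  -- index array selecting the variables of block i
    L[i]       -- assumed Lipschitz constant of the i-th block of the gradient
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        for i, idx in enumerate(blocks):       # blocks taken in cyclic order
            g = grad(x)                        # gradient at the current point
            step = x[idx] - g[idx] / L[i]      # gradient step in block i only
            x[idx] = project[i](step)          # project back onto block i's set
    return x

# Hypothetical example: least squares over the nonnegative orthant,
# f(x) = 0.5 * ||A x - b||^2, with the variables split into two blocks.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
grad = lambda x: A.T @ (A @ x - b)
blocks = [np.arange(0, 5), np.arange(5, 10)]
L = [np.linalg.norm(A[:, idx], 2) ** 2 for idx in blocks]  # per-block Lipschitz bounds
project = [lambda z: np.maximum(z, 0.0)] * 2               # projection onto x >= 0
x_hat = cyclic_block_gradient_projection(grad, project, blocks, L, np.zeros(10))
```

In this sketch each outer pass cycles once through the blocks; the unconstrained case discussed in the summary corresponds to taking each project[i] to be the identity map.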
ISSN: 1052-6234
EISSN: 1095-7189
DOI: 10.1137/120887679