On the convergence of a linesearch based proximal-gradient method for nonconvex optimization


Bibliographic Details
Published in: Inverse Problems, Vol. 33, No. 5, pp. 55005–55034
Main Authors: Bonettini, S.; Loris, I.; Porta, F.; Prato, M.; Rebegoldi, S.
Format: Journal Article
Language: English
Published: IOP Publishing, 01.05.2017

More Information
Summary: We consider a variable metric linesearch based proximal gradient method for the minimization of the sum of a smooth, possibly nonconvex function and a convex, possibly nonsmooth term. We prove convergence of this iterative algorithm to a critical point if the objective function satisfies the Kurdyka-Łojasiewicz property at each point of its domain, under the assumption that a limit point exists. The proposed method is applied to a wide collection of image processing problems, and our numerical tests show that the algorithm is flexible, robust and competitive when compared with recently proposed approaches for the optimization problems arising in the considered applications.
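As a rough illustration of the class of methods the abstract describes, a plain proximal-gradient iteration with a backtracking linesearch for minimizing f(x) + λ‖x‖₁, with f smooth and possibly nonconvex, can be sketched as follows. This is a generic textbook scheme, not the paper's variable-metric algorithm; all function names and parameter values are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1, a common convex nonsmooth term g."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def proximal_gradient(x0, grad_f, f, lam=0.1, step=1.0, beta=0.5,
                      max_iter=500, tol=1e-8):
    """Minimize f(x) + lam * ||x||_1 for smooth (possibly nonconvex) f.

    Uses a proximal-gradient step with backtracking on the step length.
    The sufficient-decrease test is the standard quadratic upper bound;
    the paper's method additionally employs a variable metric and a
    linesearch along the proximal direction, both omitted in this sketch.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        g = grad_f(x)
        t = step
        for _ in range(50):  # cap the backtracking loop
            y = soft_threshold(x - t * g, t * lam)
            d = y - x
            # accept t once f is majorized by its quadratic model at x
            if f(y) <= f(x) + g.dot(d) + d.dot(d) / (2.0 * t):
                break
            t *= beta
        if np.linalg.norm(y - x) <= tol:
            return y
        x = y
    return x
```

For a toy smooth term f(x) = 0.25‖x − c‖² with c = (3, −2, 0.1) and λ = 0.5, the iteration converges to (2, −1, 0), the soft-thresholded minimizer of the composite objective.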
Bibliography: IP-101028.R2
ISSN: 0266-5611, 1361-6420
DOI: 10.1088/1361-6420/aa5bfd