Newton-like methods for numerical optimization on manifolds

Bibliographic Details
Published in: Conference Record of the Thirty-Eighth Asilomar Conference on Signals, Systems and Computers, 2004, Vol. 1, pp. 136-139
Main Authors: Huper, K., Trumpf, J.
Format: Conference Proceeding
Language: English
Published: Piscataway, NJ: IEEE, 2004
Summary: Many problems in signal processing require the numerical optimization of a cost function defined on a smooth manifold. In particular, orthogonally or unitarily constrained optimization problems arise frequently in signal processing tasks involving subspaces. In this paper we consider Newton-like methods for solving these types of problems. Under the assumption that the parameterization of the manifold is linked to so-called Riemannian normal coordinates, our algorithms can be considered intrinsic Newton methods. Moreover, even when no such relationship holds, we can still prove local quadratic convergence to a critical point of the cost function by means of analysis on manifolds. Our approach is demonstrated by a detailed example: computing the dominant eigenspace of a real symmetric matrix.
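The abstract's worked example, computing an eigenspace of a real symmetric matrix, can be illustrated with a classical Newton-related scheme. The sketch below uses Rayleigh quotient iteration on the unit sphere, which is not the algorithm from the paper but a standard method whose fast local convergence to an eigenvector parallels the local quadratic convergence result described above; the function name and tolerances are illustrative choices.

```python
import numpy as np

def rayleigh_quotient_iteration(A, x0, tol=1e-12, max_iter=50):
    """Compute an eigenpair of a symmetric matrix A by Rayleigh
    quotient iteration, a classical Newton-related scheme on the
    unit sphere. Converges locally (to the eigenvector nearest the
    starting point, not necessarily the dominant one)."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        rho = x @ A @ x  # Rayleigh quotient: current eigenvalue estimate
        try:
            # Newton-like step: solve the shifted linear system
            y = np.linalg.solve(A - rho * np.eye(len(x)), x)
        except np.linalg.LinAlgError:
            break  # shift is (numerically) an eigenvalue: x has converged
        x_new = y / np.linalg.norm(y)  # retract back onto the unit sphere
        if np.linalg.norm(x_new - np.sign(x_new @ x) * x) < tol:
            x = x_new
            break
        x = x_new
    return x @ A @ x, x  # final eigenvalue estimate and unit eigenvector
```

To target the dominant eigenspace specifically, one would typically warm-start such a local method, e.g. with a few power-iteration steps, before switching to the fast locally convergent iteration.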
ISBN: 0780386221; 9780780386228
DOI: 10.1109/ACSSC.2004.1399106