A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions


Bibliographic Details
Published in: Journal of Computational and Applied Mathematics, Vol. 351, pp. 14-28
Main Authors: Vlček, Jan; Lukšan, Ladislav
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.05.2019
Summary: To improve the performance of the limited-memory variable metric L-BFGS method for large-scale unconstrained optimization, repeating some of the BFGS updates was proposed, e.g., in Al-Baali (1999, 2002). Since the repeating process can be time-consuming, the extra updates need to be selected carefully. We show that for the limited-memory variable metric BNS method, matrix updating can, under certain conditions, be efficiently repeated infinitely many times, with only a small increase in the number of arithmetic operations. The limit matrix can be written as a block BFGS update (Vlček and Lukšan, 2018), which can be obtained by solving a low-order Lyapunov matrix equation. The resulting method can be advantageously combined with methods based on vector corrections for conjugacy; see, e.g., Vlček and Lukšan (2015). Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new method.
ISSN: 0377-0427, 1879-1778
DOI: 10.1016/j.cam.2018.10.054
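
The abstract above describes the method only at a high level. For orientation, the following is a minimal sketch of the standard L-BFGS two-loop recursion, the baseline limited-memory direction computation that the article builds on; it is not the authors' BNS scheme with infinitely repeated updates or the Lyapunov-equation construction. The function name lbfgs_direction, the pair lists s_list/y_list, and the conventional initial scaling gamma are illustrative assumptions, not taken from the paper.

import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Compute -H_k @ grad, where H_k is the L-BFGS inverse-Hessian
    approximation implied by the stored pairs s_i = x_{i+1} - x_i and
    y_i = g_{i+1} - g_i (oldest pair first in the lists)."""
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: walk backwards from the newest stored pair.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q = q - alpha * y
    # Initial inverse-Hessian guess: the usual scaled identity gamma * I.
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: walk forwards, reusing the stored alpha values.
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, r)
        r = r + (alpha - beta) * s
    return -r

In a limited-memory method only the most recent m pairs are kept, so one direction costs O(mn) operations; the abstract's point is that the extra BNS updates can be repeated in the limit while keeping the increase in arithmetic work small.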