Algorithms for quasiconvex minimization

Bibliographic Details
Published in: Optimization, Vol. 60, No. 8-9, pp. 1105-1117
Main Authors: Da Cruz Neto, J.X.; Lopes, J.O.; Travaglia, M.V.
Format: Journal Article
Language: English
Published: Philadelphia, Taylor & Francis Group, 01.08.2011

Summary: In this article we propose two algorithms for the minimization of quasiconvex functions. The first is of subgradient type. The second is a steepest descent method with Armijo's rule. Both use elements of Plastria's lower subdifferential. Under certain conditions, we prove that the sequence generated by each algorithm converges globally to a solution. We also provide a counter-example showing that choosing the negative gradient direction does not ensure global convergence of the descent method to a solution. This counter-example is related to a mistake in the proof of Theorem 3.1 of J.P. Dussault [Convergence of implementable descent algorithms for unconstrained optimization (technical note), J. Optim. Theory Appl. 104 (2000), pp. 739-745], and we point out the mistake in the proof of that theorem.
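For reference, the steepest descent method with Armijo's rule mentioned in the summary can be sketched as follows. This is a generic textbook version for differentiable functions, not the authors' algorithm (which replaces the gradient with elements of Plastria's lower subdifferential to handle quasiconvexity); the function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def armijo_descent(f, grad, x0, sigma=1e-4, beta=0.5, tol=1e-8, max_iter=1000):
    """Steepest descent with Armijo's backtracking rule (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stationarity test
            break
        t = 1.0
        # Armijo condition: backtrack until a sufficient decrease
        # along the direction -g is achieved.
        while f(x - t * g) > f(x) - sigma * t * np.dot(g, g):
            t *= beta
        x = x - t * g
    return x

# Example on a smooth convex (hence quasiconvex) test function
f = lambda x: np.dot(x, x)
grad = lambda x: 2.0 * x
x_star = armijo_descent(f, grad, np.array([3.0, -4.0]))
```

For this test function the iterates approach the minimizer at the origin; the paper's counter-example shows that for general quasiconvex functions the plain negative-gradient direction need not yield global convergence, which is exactly what the lower-subdifferential modification addresses.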
ISSN: 0233-1934; 1029-4945
DOI: 10.1080/02331934.2010.528760