Hybrid methods for numerical optimization problems


Bibliographic Details
Published in: 2008 SICE Annual Conference, pp. 17-18
Main Authors: Min-Jea Tahk, Yoshito Ohta
Format: Conference Proceeding
Language: English
Published: IEEE, 01.08.2008

Summary: This presentation introduces hybrid optimization algorithms, which combine evolutionary algorithms (EA) and gradient search techniques, for optimization with continuous parameters. Inheriting the advantages of the two approaches, the new methods are fast and capable of global search. The key feature of the proposed hybrid methods is that gradient search becomes effective only when the solution region has been found and local search with fast convergence is needed. Thus, the transition from global search to local search is accomplished implicitly and automatically. The main structure of the new method is similar to that of EA, but a special individual called the gradient individual is introduced and the EA individuals are located symmetrically. The gradient individual is propagated through generations by means of a quasi-Newton method modified for hybrid optimization. Gradient information is calculated from the costs of EA individuals produced by evolution strategy (ES), so no extra computational burden is incurred. For estimation of the inverse Hessian matrix, two approaches have been studied. The first is based on classical quasi-Newton algorithms, among which the Symmetric Rank-1 (SR1) update shows better performance than BFGS and DFP. The second is to estimate the Hessian matrix directly from the fitness values of the evolutionary population, rather than from estimates of the gradient vector. Two new algorithms developed for the second approach exhibit more stable Hessian estimation than the quasi-Newton algorithms. Numerical tests on various benchmark problems demonstrate that the new hybrid algorithms converge faster than EA without sacrificing global search capability.
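The two ingredients named in the abstract — gradients recovered from symmetrically placed sample points and an SR1 update of the inverse Hessian — can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the authors' implementation: it uses plain central differences in place of an ES population and takes full quasi-Newton steps with no line search, which is only safe on well-behaved (e.g., quadratic) test functions.

```python
import numpy as np

def grad_symmetric(f, x, h=1e-4):
    """Central-difference gradient from symmetric sample pairs,
    mimicking gradient estimation from symmetrically placed individuals.
    (Illustrative stand-in for the paper's ES-based estimate.)"""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def sr1_minimize(f, x0, iters=50, h=1e-4, tol=1e-10):
    """Quasi-Newton iteration using the Symmetric Rank-1 (SR1)
    update of the inverse-Hessian estimate H."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))                 # initial inverse-Hessian estimate
    g = grad_symmetric(f, x, h)
    for _ in range(iters):
        p = -H @ g                     # quasi-Newton search direction
        x_new = x + p                  # full step (no line search here)
        g_new = grad_symmetric(f, x_new, h)
        s, y = x_new - x, g_new - g
        u = s - H @ y
        denom = u @ y
        # Skip the update when the SR1 denominator is near zero
        # (the standard safeguard against an unstable rank-1 correction).
        if abs(denom) > 1e-12 * np.linalg.norm(u) * np.linalg.norm(y):
            H += np.outer(u, u) / denom
        x, g = x_new, g_new
        if np.linalg.norm(g) < tol:
            break
    return x

# Usage on a simple quadratic: minimum at (1, -2).
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
x_star = sr1_minimize(f, [0.0, 0.0])
```

On a quadratic, the SR1 recursion reproduces the exact inverse Hessian after enough linearly independent steps, after which the iteration lands on the minimizer in one more step — the fast local convergence the abstract attributes to the gradient individual.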
ISBN: 4907764308, 9784907764302
DOI: 10.1109/SICE.2008.4654606