Application of chaotic Fish School Search optimization algorithm with exponential step decay in neural network loss function optimization


Bibliographic Details
Published in: Procedia Computer Science, Vol. 186, pp. 352–359
Main Authors: Demidova, L.A., Gorchakov, A.V.
Format: Journal Article
Language: English
Published: Elsevier B.V., 2021

Summary: The Fish School Search (FSS) algorithm is a heuristic technique for finding globally optimal solutions, characterized by simplicity of implementation and high performance. Since its first mention, FSS has attracted great interest among researchers and practitioners around the globe. Modifications of FSS exist that have been applied to practical problems, including image reconstruction in electrical impedance tomography, assembly line balancing, and neural network structure optimization. In this paper, we consider a modification of the FSS algorithm that uses chaos theory to generate uniformly distributed pseudorandom numbers and incorporates exponential step decay. This modified optimization algorithm, known as ETFSS, is characterized by faster convergence and better performance. To further investigate the performance of the novel optimization algorithm, we apply ETFSS to neural network loss function optimization. In addition, we compare the described approach with other machine learning techniques, such as the support vector machine (SVM) algorithm, the k-nearest neighbors (KNN) algorithm, and a backpropagation-based neural network trained with the adaptive moment estimation (Adam) optimizer. We visualize classification results using the t-distributed stochastic neighbor embedding (t-SNE) and uniform manifold approximation and projection (UMAP) methods to provide more detail on classification performance and dataset shape. The obtained results confirm that ETFSS can produce slightly more accurate classifications than backpropagation.
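The summary names two ingredients of ETFSS: a chaos-based pseudorandom number generator and an exponential step decay schedule. The sketch below illustrates both in a minimal form; the function names, the use of the logistic map at r = 4, and the exact decay formula are assumptions for illustration, not the paper's verified implementation, which may differ in detail.

```python
import numpy as np

def logistic_map(x0, n, r=4.0):
    """Generate n values in (0, 1) via the logistic map x_{k+1} = r*x_k*(1-x_k).

    At r = 4 the map is chaotic, and its iterates are a common stand-in
    for uniform pseudorandom numbers in chaos-based metaheuristics.
    The seed x0 must avoid fixed points (e.g. 0, 0.25, 0.5, 0.75, 1).
    """
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

def exponential_step_decay(step0, gamma, t):
    """Step size at iteration t, shrinking as step0 * exp(-gamma * t).

    Large early steps encourage exploration; exponentially smaller
    later steps refine the solution (the decay rate gamma is a
    hypothetical tuning parameter here).
    """
    return step0 * np.exp(-gamma * t)
```

In an FSS-style loop, `logistic_map` values would replace uniform random draws in the individual and collective movement operators, while `exponential_step_decay` would supply the per-iteration step length.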
ISSN: 1877-0509
DOI: 10.1016/j.procs.2021.04.156