A Training-Free Neural Architecture Search Algorithm Based on Search Economics


Bibliographic Details
Published in: IEEE Transactions on Evolutionary Computation, Vol. 28, No. 2, pp. 445–459
Main Authors: Wu, Meng-Ting; Lin, Hung-I; Tsai, Chun-Wei
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.04.2024

More Information
Summary: Motivated by the observation that most neural architecture search (NAS) methods are time consuming because a "training process" is required to evaluate each searched neural architecture, this article presents an efficient NAS algorithm based on a promising metaheuristic algorithm named search economics (SE) and a new training-free estimator for evaluating the searched neural architectures, which not only obtains a good neural architecture but also accelerates the computation. The basic idea of the proposed NAS algorithm is to use the so-called expected value of each region in the search space to guide the search, so that it focuses on high-potential regions rather than on individual solutions with high objective values in particular regions. To evaluate the performance of the proposed algorithm, the authors compare it with state-of-the-art non-training-free and training-free NAS methods. Experimental results show that the proposed algorithm finds results similar to or better than those found by most of the non-training-free NAS algorithms compared in this study, while taking only a tiny fraction of the computation time.
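To make the summary's core idea concrete, the following is a minimal, hypothetical sketch of region-guided search: the search space is partitioned into regions, each region is scored by an "expected value" combining the best objective seen there with an exploration bonus, and the search budget is spent on the highest-scoring region each round. The toy objective, the specific scoring formula, and all names here are illustrative assumptions, not the paper's actual definitions or its training-free estimator.

```python
import random

def objective(x):
    # Toy stand-in for a training-free architecture score (assumption:
    # higher is better, peak at x = 0.37).
    return -(x - 0.37) ** 2

def region_guided_search(n_regions=10, rounds=20, samples_per_round=5, seed=0):
    rng = random.Random(seed)
    # Partition the 1-D search space [0, 1) into equal-width regions.
    bounds = [(i / n_regions, (i + 1) / n_regions) for i in range(n_regions)]
    best = [0.0] * n_regions   # best score observed per region (optimistic init)
    visits = [0] * n_regions   # samples drawn from each region so far
    best_x, best_score = None, float("-inf")
    for _ in range(rounds):
        # Illustrative "expected value": observed quality plus a bonus
        # that shrinks as a region is sampled more often.
        ev = [best[i] + 1.0 / (1 + visits[i]) for i in range(n_regions)]
        target = max(range(n_regions), key=lambda i: ev[i])
        lo, hi = bounds[target]
        # Spend this round's budget inside the most promising region.
        for _ in range(samples_per_round):
            x = rng.uniform(lo, hi)
            s = objective(x)
            visits[target] += 1
            best[target] = max(best[target], s)
            if s > best_score:
                best_x, best_score = x, s
    return best_x, best_score
```

In this sketch the search first cycles through all regions (the exploration bonus dominates), then concentrates its remaining budget on the regions containing the best scores, which is the "search high-potential regions, not individual high-value solutions" behavior the summary describes.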
ISSN: 1089-778X, 1941-0026
DOI: 10.1109/TEVC.2023.3264533