A technical view on neural architecture search

Bibliographic Details
Published in: International Journal of Machine Learning and Cybernetics, Vol. 11, No. 4, pp. 795–811
Main Authors: Hu, Yi-Qi; Yu, Yang
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.04.2020
Summary: Owing to the discovery of innovative and practical neural architectures, deep learning has achieved remarkable success in many fields, such as computer vision, natural language processing, and recommendation systems. To reach high performance, researchers must adjust neural architectures and choose training tricks very carefully. This manual trial-and-error process for discovering the best neural network configuration consumes substantial human effort. Neural architecture search (NAS) aims to alleviate this issue by configuring neural networks automatically. Recently, the rapid development of NAS has produced significant achievements: novel neural network architectures that outperform state-of-the-art handcrafted networks have been discovered on image classification benchmarks. In this paper, we survey NAS from a technical view. By summarizing previous NAS approaches, we give readers a picture of NAS that covers problem definition, search approaches, progress toward practical applications, and possible future directions. We hope that this paper can help beginners start their research on NAS.
ISSN: 1868-8071; 1868-808X
DOI: 10.1007/s13042-020-01062-1