Searching Toward Pareto-Optimal Device-Aware Neural Architectures

Bibliographic Details
Published in: 2018 IEEE/ACM International Conference on Computer-Aided Design (ICCAD), pp. 1-7
Main Authors: Cheng, An-Chieh; Dong, Jin-Dong; Hsu, Chi-Hung; Chang, Shu-Huan; Sun, Min; Chang, Shih-Chieh; Pan, Jia-Yu; Chen, Yu-Ting; Wei, Wei; Juan, Da-Cheng
Format: Conference Proceeding
Language: English
Published: ACM, 01.11.2018

Summary: Recent breakthroughs in Neural Architecture Search (NAS) have achieved state-of-the-art performance on many tasks, such as image classification and language understanding. However, most existing work optimizes only for model accuracy and largely ignores other important factors imposed by the underlying hardware, such as latency and energy consumption at inference time. In this paper, we first introduce the NAS problem and survey recent work. We then examine two recent advances that extend NAS into a multi-objective framework: MONAS [19] and DPP-Net [10]. Both MONAS and DPP-Net can optimize accuracy alongside device-imposed objectives, searching for neural architectures that deploy well across a wide spectrum of devices, from embedded systems and mobile devices to workstations. Experimental results show that the architectures found by MONAS and DPP-Net achieve Pareto optimality with respect to the given objectives on various devices.
ISSN: 1558-2434
DOI: 10.1145/3240765.3243494
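
To make the notion of Pareto optimality in the abstract concrete: an architecture is Pareto-optimal when no other candidate matches or beats it on every objective while being strictly better on at least one. Below is a minimal, self-contained Python sketch of this dominance check over accuracy (higher is better) and device costs such as latency and energy (lower is better). The objective names, architecture labels, and measurements are hypothetical illustrations, not values or code taken from MONAS or DPP-Net.

# A minimal sketch (hypothetical, not from the paper) of the Pareto
# filtering implied by the abstract: keep only architectures that no
# other candidate dominates on accuracy, latency, and energy.

from typing import Dict, List

Arch = Dict[str, float]

def dominates(a: Arch, b: Arch) -> bool:
    """True if `a` Pareto-dominates `b`: `a` is no worse on every
    objective and strictly better on at least one."""
    no_worse = (a["accuracy"] >= b["accuracy"]
                and a["latency_ms"] <= b["latency_ms"]
                and a["energy_mj"] <= b["energy_mj"])
    strictly_better = (a["accuracy"] > b["accuracy"]
                       or a["latency_ms"] < b["latency_ms"]
                       or a["energy_mj"] < b["energy_mj"])
    return no_worse and strictly_better

def pareto_front(candidates: List[Arch]) -> List[Arch]:
    """Return the candidates not dominated by any other candidate."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

# Hypothetical measurements for three searched architectures.
candidates = [
    {"accuracy": 0.95, "latency_ms": 120.0, "energy_mj": 40.0},  # arch-A
    {"accuracy": 0.93, "latency_ms": 35.0,  "energy_mj": 12.0},  # arch-B
    {"accuracy": 0.93, "latency_ms": 60.0,  "energy_mj": 20.0},  # arch-C
]

print(pareto_front(candidates))

Running the sketch keeps arch-A and arch-B and discards arch-C, which arch-B matches on accuracy while costing strictly less latency and energy: exactly the kind of device-aware trade-off curve the surveyed methods search for.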