AnytimeNet: Controlling Time-Quality Tradeoffs in Deep Neural Network Architectures
| Published in | 2020 Design, Automation & Test in Europe Conference & Exhibition (DATE), pp. 945 - 950 |
|---|---|
| Main Authors | , , |
| Format | Conference Proceeding |
| Language | English |
| Published | EDAA, 01.03.2020 |
| Subjects | |
| Online Access | Get full text |
| Summary | Deeper neural networks, especially those with extremely large numbers of internal parameters, impose a heavy computational burden in obtaining sufficiently high-quality results. These burdens impede the application of machine learning and related techniques to time-critical computing systems. To address this challenge, we propose an architectural approach for neural networks that adaptively trades off computation time against solution quality, achieving high-quality solutions in a timely manner. We propose AnytimeNet, a novel and general framework that gradually inserts additional layers, so users can expect monotonically increasing solution quality as more computation time is expended. The framework allows users to choose at runtime when to retrieve a result. Extensive evaluation on classification tasks demonstrates that the proposed architecture adapts classification solution quality to the available computation time. |
|---|---|
| ISSN | 1558-1101 |
| DOI | 10.23919/DATE48585.2020.9116280 |
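The anytime-inference pattern the abstract describes (a model built from stages, each yielding a usable and progressively better prediction, with the caller free to stop at a deadline) can be sketched as follows. This is not the authors' implementation: the stage functions, the `quality` values, and `anytime_predict` are invented placeholders that only illustrate the deadline-driven control flow.

```python
import time

def make_stage(quality):
    """Build a placeholder 'layer': more stages stand in for higher quality."""
    def stage(x):
        return {"prediction": x * quality, "quality": quality}
    return stage

# Hypothetical pipeline: each later stage returns a higher-quality result,
# mirroring the monotone time-quality tradeoff claimed in the abstract.
stages = [make_stage(q) for q in (0.5, 0.7, 0.85, 0.95)]

def anytime_predict(x, deadline_s):
    """Run stages until the deadline elapses; return the latest completed result."""
    start = time.monotonic()
    result = None
    for stage in stages:
        if time.monotonic() - start > deadline_s:
            break  # out of time: hand back the best result so far
        result = stage(x)
    return result
```

With a generous deadline all stages run and the final (highest-quality) result is returned; under a tight deadline the loop exits early and the caller receives an earlier stage's coarser output instead of missing its deadline.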