AgEBO-Tabular: Joint Neural Architecture and Hyperparameter Search with Autotuned Data-Parallel Training for Tabular Data

Bibliographic Details
Published in: SC21: International Conference for High Performance Computing, Networking, Storage and Analysis, pp. 1-16
Main Authors: Egele, Romain; Balaprakash, Prasanna; Guyon, Isabelle; Vishwanath, Venkatram; Xia, Fangfang; Stevens, Rick; Liu, Zhengying
Format: Conference Proceeding
Language: English
Published: ACM, 14.11.2021

More Information
Summary: Developing high-performing predictive models for large tabular data sets is a challenging task. Neural architecture search (NAS) is an AutoML approach that generates and evaluates multiple neural networks with different architectures concurrently to automatically discover a high-performing model. A key issue in NAS, particularly for large data sets, is the large computation time required to evaluate each generated architecture. While data-parallel training has the potential to address this issue, a straightforward approach can result in a significant loss of accuracy. To that end, we develop AgEBO-Tabular, which combines Aging Evolution (AE) to search over neural architectures and asynchronous Bayesian optimization (BO) to search over hyperparameters that adapt data-parallel training. We evaluate the efficacy of our approach on two large predictive modeling tabular data sets from the Exascale Computing Project-CANcer Distributed Learning Environment (ECP-CANDLE).
ISSN: 2167-4337
DOI: 10.1145/3458817.3476203
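
The search loop summarized above (Aging Evolution over architectures, paired with an asynchronous Bayesian optimization loop over data-parallel training hyperparameters) can be sketched in a few dozen lines. The toy Python below is not the authors' implementation: the architecture encoding, the hyperparameter space, the evaluate function, and the propose_hps helper are all hypothetical placeholders, the BO component is replaced by a biased random sampler, and the asynchronous, parallel evaluation of candidates is collapsed into a sequential loop.

    # Toy sketch of an AgEBO-style joint search: Aging Evolution (AE) over a
    # toy architecture encoding, plus a hyperparameter proposal step standing
    # in for asynchronous Bayesian optimization (BO). All names and spaces
    # below are illustrative assumptions, not the paper's actual interface.
    import random
    from collections import deque

    ARCH_CHOICES = [0, 1, 2, 3]      # toy per-cell operation choices
    ARCH_LEN = 8                     # toy architecture encoding length
    HP_SPACE = {                     # hypothetical data-parallel training knobs
        "learning_rate": (1e-4, 1e-1),
        "batch_size": [256, 512, 1024, 2048],
    }

    def evaluate(arch, hps):
        """Synthetic stand-in for training + validation accuracy."""
        score = -sum((a - 2) ** 2 for a in arch)          # prefers ops near "2"
        score -= abs(hps["learning_rate"] - 0.01) * 10    # prefers lr ~ 0.01
        return score + random.gauss(0, 0.1)               # training noise

    def propose_hps(history):
        """Placeholder for asynchronous BO: random sampling, biased toward the
        best hyperparameters observed so far."""
        if history and random.random() < 0.5:
            best = max(history, key=lambda rec: rec[2])[1]
            lr = min(max(best["learning_rate"] * random.uniform(0.5, 2.0), 1e-4), 1e-1)
            return {"learning_rate": lr,
                    "batch_size": random.choice(HP_SPACE["batch_size"])}
        lo, hi = HP_SPACE["learning_rate"]
        return {"learning_rate": random.uniform(lo, hi),
                "batch_size": random.choice(HP_SPACE["batch_size"])}

    def mutate(arch):
        child = list(arch)
        child[random.randrange(ARCH_LEN)] = random.choice(ARCH_CHOICES)
        return child

    def agebo_toy(population_size=10, sample_size=3, budget=100):
        history, population = [], deque()
        while len(population) < population_size:          # initial random population
            arch = [random.choice(ARCH_CHOICES) for _ in range(ARCH_LEN)]
            hps = propose_hps(history)
            rec = (arch, hps, evaluate(arch, hps))
            population.append(rec); history.append(rec)
        for _ in range(budget):
            parent = max(random.sample(list(population), sample_size),
                         key=lambda rec: rec[2])          # tournament selection
            child_arch = mutate(parent[0])                # AE: mutate the winner
            child_hps = propose_hps(history)              # BO stand-in proposes HPs
            rec = (child_arch, child_hps, evaluate(child_arch, child_hps))
            population.append(rec); history.append(rec)
            population.popleft()                          # "aging": drop the oldest
        return max(history, key=lambda rec: rec[2])

    if __name__ == "__main__":
        best = agebo_toy()
        print("best score:", round(best[2], 3), "arch:", best[0], "hps:", best[1])

In the actual method, the evaluate step corresponds to data-parallel training of a candidate network on a tabular data set, and the hyperparameters proposed by BO (e.g., learning rate and batch size in this sketch) control that data-parallel training; the sketch only mirrors the control flow, not the distributed execution.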