AgEBO-Tabular: Joint Neural Architecture and Hyperparameter Search with Autotuned Data-Parallel Training for Tabular Data
Published in | SC21: International Conference for High Performance Computing, Networking, Storage and Analysis, pp. 1 - 16 |
---|---|
Main Authors | , , , , , , |
Format | Conference Proceeding |
Language | English |
Published | ACM, 14.11.2021 |
Subjects | |
Summary: | Developing high-performing predictive models for large tabular data sets is a challenging task. Neural architecture search (NAS) is an AutoML approach that generates and evaluates multiple neural networks with different architectures concurrently to automatically discover a high-performing model. A key issue in NAS, particularly for large data sets, is the large computation time required to evaluate each generated architecture. While data-parallel training has the potential to address this issue, a straightforward approach can result in significant loss of accuracy. To that end, we develop AgEBO-Tabular, which combines Aging Evolution (AE) to search over neural architectures and asynchronous Bayesian optimization (BO) to search over hyperparameters to adapt data-parallel training. We evaluate the efficacy of our approach on two large predictive modeling tabular data sets from the Exascale Computing Project-CANcer Distributed Learning Environment (ECP-CANDLE). |
ISSN: | 2167-4337 |
DOI: | 10.1145/3458817.3476203 |
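The summary describes combining an Aging Evolution loop over neural architectures with asynchronous Bayesian optimization over data-parallel training hyperparameters. The sketch below is a rough illustration of the aging-evolution component only, not the authors' implementation: the search spaces, the `evaluate` objective, and the learning-rate choice standing in for a BO suggestion are all hypothetical placeholders.

```python
import random
from collections import deque

# Hypothetical stand-ins: the real architecture space, hyperparameter space,
# and training routine are defined by the paper's software, not reproduced here.
ARCH_CHOICES = [0, 1, 2, 3]       # token choices per architecture decision slot
ARCH_LENGTH = 5                   # number of decision slots in an architecture
LR_CHOICES = [1e-3, 5e-4, 1e-4]   # training hyperparameters a BO loop would tune

def evaluate(arch, lr):
    """Placeholder objective; a real run would train the network
    (possibly with data-parallel training) and return validation accuracy."""
    return sum(arch) * lr  # dummy score, for illustration only

def aging_evolution(population_size=10, sample_size=3, iterations=50):
    population = deque(maxlen=population_size)  # oldest members age out on append
    history = []

    # Seed the population with random architectures.
    for _ in range(population_size):
        arch = [random.choice(ARCH_CHOICES) for _ in range(ARCH_LENGTH)]
        lr = random.choice(LR_CHOICES)          # stand-in for a BO suggestion
        score = evaluate(arch, lr)
        population.append((arch, score))
        history.append((arch, score))

    for _ in range(iterations):
        # Tournament selection: sample a few members, keep the best as parent.
        parent = max(random.sample(list(population), sample_size), key=lambda m: m[1])
        # Mutation: change one architecture decision at random.
        child = list(parent[0])
        child[random.randrange(ARCH_LENGTH)] = random.choice(ARCH_CHOICES)
        lr = random.choice(LR_CHOICES)          # stand-in for a BO suggestion
        score = evaluate(child, lr)
        population.append((child, score))       # evicts the oldest member
        history.append((child, score))

    return max(history, key=lambda m: m[1])

if __name__ == "__main__":
    best_arch, best_score = aging_evolution()
    print(best_arch, best_score)
```

In the paper's setting, the placeholder objective would be replaced by an actual (data-parallel) training run, and the learning-rate draw would come from an asynchronous BO loop tuning the data-parallel training hyperparameters rather than from a uniform random choice.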