A Combinatorial Approach to Hyperparameter Optimization
Published in | 2024 IEEE/ACM 3rd International Conference on AI Engineering – Software Engineering for AI (CAIN), pp. 140 - 149 |
---|---|
Main Authors | , , , , |
Format | Conference Proceeding |
Language | English |
Published | ACM, 14.04.2024 |
DOI | 10.1145/3644815.3644941 |
Summary | In machine learning, hyperparameter optimization (HPO) is essential for effective model training and significantly impacts model performance. Hyperparameters are predefined model settings that fine-tune the model's behavior and are critical to modeling complex data patterns. Traditional HPO approaches such as Grid Search, Random Search, and Bayesian Optimization have been widely used in this field. However, as datasets grow and models increase in complexity, these approaches often require a significant amount of time and resources for HPO. This research introduces a novel approach using t-way testing, a combinatorial approach to software testing used for identifying faults with a test set that covers all t-way interactions, for HPO. t-way testing substantially narrows the search space and effectively covers parameter interactions. Our experimental results show that our approach reduces the number of necessary model evaluations and significantly cuts computational expenses while still outperforming traditional HPO approaches for the models studied in our experiments. CCS Concepts: • Computing methodologies → Machine learning; • Software and its engineering → Software maintenance tools. |
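
The summary describes applying t-way (combinatorial) testing to HPO. As a minimal sketch of the underlying idea only, the snippet below greedily builds a 2-way (pairwise) covering set over a small hyperparameter grid. It is not the authors' implementation; the parameter names, values, and greedy strategy are assumptions chosen to illustrate how a t-way covering set can be far smaller than the exhaustive grid.

```python
# Illustrative sketch (assumed example, not the paper's implementation):
# greedily build a pairwise (t=2) covering set of hyperparameter configurations.
from itertools import combinations, product

# Hypothetical hyperparameter search space.
search_space = {
    "learning_rate": [0.001, 0.01, 0.1],
    "batch_size": [32, 64, 128],
    "optimizer": ["sgd", "adam"],
    "dropout": [0.0, 0.3, 0.5],
}

names = list(search_space)

# Every 2-way interaction that must be covered: a pair of (parameter, value)
# assignments for two distinct parameters.
uncovered = {
    ((p1, v1), (p2, v2))
    for p1, p2 in combinations(names, 2)
    for v1 in search_space[p1]
    for v2 in search_space[p2]
}

def pairs_of(config):
    """All 2-way interactions contained in one full configuration."""
    items = list(zip(names, config))
    return {(a, b) for a, b in combinations(items, 2)}

# Greedy selection: repeatedly add the configuration that covers the largest
# number of still-uncovered pairs (exhaustive enumeration of the product is
# fine here because the example space is tiny).
candidates = list(product(*search_space.values()))
selected = []
while uncovered:
    best = max(candidates, key=lambda c: len(pairs_of(c) & uncovered))
    selected.append(dict(zip(names, best)))
    uncovered -= pairs_of(best)

print(f"{len(candidates)} exhaustive configs reduced to {len(selected)} pairwise-covering configs")
for cfg in selected:
    print(cfg)
```

For larger spaces, enumerating the full Cartesian product as done above becomes infeasible; dedicated covering-array tools such as NIST ACTS construct t-way test sets without enumerating the whole grid, which is the setting where combinatorial HPO pays off.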