On hyperparameter optimization of machine learning algorithms: Theory and practice

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 415, pp. 295–316
Main Authors: Yang, Li; Shami, Abdallah
Format: Journal Article
Language: English
Published: Elsevier B.V., 20.11.2020

More Information
Summary: Machine learning algorithms have been used widely in various applications and areas. To fit a machine learning model to different problems, its hyper-parameters must be tuned. Selecting the best hyper-parameter configuration for a machine learning model has a direct impact on the model's performance, and it often requires deep knowledge of both machine learning algorithms and appropriate hyper-parameter optimization techniques. Although several automatic optimization techniques exist, they have different strengths and drawbacks when applied to different types of problems. In this paper, we study the optimization of the hyper-parameters of common machine learning models. We introduce several state-of-the-art optimization techniques and discuss how to apply them to machine learning algorithms. We also survey the libraries and frameworks developed for hyper-parameter optimization problems and discuss open challenges in hyper-parameter optimization research. Moreover, we conduct experiments on benchmark datasets to compare the performance of different optimization methods and to provide practical examples of hyper-parameter optimization. This survey will help industrial users, data analysts, and researchers develop better machine learning models by identifying proper hyper-parameter configurations efficiently.
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2020.07.061
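
As a practical illustration of the hyper-parameter optimization workflow the summary describes, the sketch below runs a cross-validated grid search over a support vector machine with scikit-learn. It is a minimal example under assumed choices: the digits benchmark dataset, the SVC model, and the C/gamma search grid are illustrative and are not taken from the paper's experiments.

    from sklearn.datasets import load_digits
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.svm import SVC

    # Illustrative benchmark dataset (an assumption, not the paper's exact setup).
    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Candidate hyper-parameter configurations, evaluated exhaustively by grid search.
    param_grid = {"C": [0.1, 1, 10, 100], "gamma": [1e-4, 1e-3, 1e-2]}

    # 5-fold cross-validated grid search: fits one model per (configuration, fold)
    # and keeps the configuration with the best mean validation score.
    search = GridSearchCV(SVC(), param_grid, cv=5)
    search.fit(X_train, y_train)

    print("Best configuration:", search.best_params_)
    print("Held-out test accuracy:", search.score(X_test, y_test))

For larger search spaces, randomized search (e.g., scikit-learn's RandomizedSearchCV) or the Bayesian optimization libraries the paper surveys can reduce the number of configurations that must be evaluated.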