Deep learning for missing value imputation of continuous data and the effect of data discretization

Bibliographic Details
Published in: Knowledge-Based Systems, Vol. 239, p. 108079
Main Authors: Lin, Wei-Chao; Tsai, Chih-Fong; Zhong, Jia Rong
Format: Journal Article
Language: English
Published: Amsterdam, Elsevier B.V., 05.03.2022

Summary: Real-world datasets are often incomplete and contain missing attribute values, yet many data mining and machine learning techniques cannot directly handle incomplete datasets. Missing value imputation is the main solution: a learning model is constructed to estimate values to replace the missing ones. Deep learning techniques have been employed for missing value imputation and have demonstrated their superiority over many other well-known imputation methods. However, very few studies have assessed the imputation performance of deep learning techniques on tabular or structured data with continuous values, and the effect on the imputation results when the continuous data must be discretized has never been examined. In this paper, two supervised deep neural networks, the multilayer perceptron (MLP) and the deep belief network (DBN), are compared for missing value imputation, and the two orderings of the data discretization and imputation steps are examined. The results show that MLP and DBN significantly outperform baseline imputation methods based on the mean, KNN, CART, and SVM, with DBN performing best. When continuous data must be discretized, the choice of imputation algorithm matters more than the order in which the two steps are combined: final performance is much better when DBN is used for imputation, regardless of whether discretization is performed first or second, than with the other imputation methods.
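The supervised-imputation idea in the summary can be illustrated with a minimal sketch: train a regressor on the complete rows to predict the column with missing entries from the other attributes, then fill the gaps with its predictions. This is not the authors' pipeline; the synthetic data, the choice of scikit-learn's MLPRegressor, and all hyperparameters here are illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))

# Inject ~20% missing values into the last column (illustrative setup).
X_missing = X.copy()
X_missing[rng.random(200) < 0.2, 3] = np.nan

# Train the MLP on complete rows: predict column 3 from columns 0-2.
complete = ~np.isnan(X_missing[:, 3])
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X_missing[complete, :3], X_missing[complete, 3])

# Replace the missing entries with the model's predictions.
X_imputed = X_missing.copy()
X_imputed[~complete, 3] = model.predict(X_missing[~complete, :3])
```

The same scheme works with any supervised learner in place of the MLP, which is how the paper's mean/KNN/CART/SVM baselines fit into a single comparison framework.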
• Deep learning for imputing missing continuous values of tabular or structured data is studied.
• In particular, multilayer perceptron (MLP) and deep belief networks (DBN) are employed.
• Two differently ordered combinations of the data discretization and imputation steps are examined.
• MLP and DBN significantly outperform the baseline imputation methods.
• DBN is the better choice for imputation when discretization of the continuous data is required.
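The two step orderings compared in the highlights can be sketched for a single attribute: either impute the continuous values first and then bin them, or bin the observed values first and then impute a bin label for the missing entries. This is a toy illustration, not the paper's procedure; mean/mode imputation and equal-width binning stand in for the studied algorithms.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(100,))
x[rng.random(100) < 0.1] = np.nan          # inject missing values
obs = ~np.isnan(x)

# Order A: impute first (mean), then discretize into 5 equal-width bins.
x_a = x.copy()
x_a[~obs] = np.nanmean(x)
edges_a = np.linspace(x_a.min(), x_a.max(), 6)
bins_a = np.digitize(x_a, edges_a[1:-1])   # labels 0..4

# Order B: discretize the observed values first, then impute the modal bin.
edges_b = np.linspace(x[obs].min(), x[obs].max(), 6)
bins_b = np.full(x.shape, -1)
bins_b[obs] = np.digitize(x[obs], edges_b[1:-1])
labels, counts = np.unique(bins_b[obs], return_counts=True)
bins_b[~obs] = labels[np.argmax(counts)]   # most frequent observed bin
```

The paper's finding is that, with a strong imputer such as DBN, the results of the two orderings differ far less than the results of swapping in a weaker imputation algorithm.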
ISSN: 0950-7051, 1872-7409
DOI: 10.1016/j.knosys.2021.108079