Interval-based Prediction Uncertainty Bound Computation in Learning with Missing Values
Main Authors: , , ,
Format: Journal Article
Language: English
Published: 01.03.2018
Summary: The problem of machine learning with missing values is common in many areas. A simple approach is to first construct a dataset without missing values, either by discarding instances with missing entries or by imputing a fixed value for each missing entry, and then to train a prediction model on the new dataset. A drawback of this naive approach is that the uncertainty in the missing entries is not properly incorporated into the prediction. To evaluate prediction uncertainty, the multiple imputation (MI) approach has been studied, but the performance of MI is sensitive to the choice of the probabilistic model of the true values in the missing entries, and its computational cost is high because multiple models must be trained. In this paper, we propose an alternative approach called the Interval-based Prediction Uncertainty Bounding (IPUB) method. The IPUB method represents the uncertainties due to missing entries as intervals, and efficiently computes the lower and upper bounds of the prediction results when all possible training sets constructed by imputing arbitrary values in the intervals are considered. The IPUB method can be applied to a wide class of convex learning algorithms, including penalized least-squares regression, the support vector machine (SVM), and logistic regression. We demonstrate the advantages of the IPUB method by comparing it with an existing method in numerical experiments with benchmark datasets.
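To make the problem setting concrete, the following is a minimal sketch (not the IPUB algorithm itself, whose efficient bound computation is not reproduced here) of the brute-force baseline that the abstract's setting implies: each missing entry is only known to lie in an interval, and we seek lower and upper bounds on a test prediction over possible imputations. The sketch uses penalized least-squares (ridge) regression, which the paper lists as a covered learner, and heuristically evaluates only interval-corner imputations; all function names and the example data are illustrative assumptions. Its exponential cost in the number of missing entries is exactly what motivates an efficient bounding method like IPUB.

```python
import itertools
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X'X + lam*I)^{-1} X'y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def naive_prediction_bounds(X, y, lower, upper, x_test, lam=1.0):
    """Heuristic lower/upper bounds on the prediction for x_test.

    Missing entries of X are marked with NaN; entry (i, j) is assumed
    to lie in [lower[i, j], upper[i, j]]. This brute-force baseline
    retrains one model per corner imputation, so its cost is 2^m for
    m missing entries; it is a sketch of the problem, not of IPUB.
    """
    miss = np.argwhere(np.isnan(X))  # positions of missing entries
    corners_per_entry = [(lower[i, j], upper[i, j]) for i, j in miss]
    preds = []
    for corner in itertools.product(*corners_per_entry):
        Xc = X.copy()
        for (i, j), v in zip(miss, corner):
            Xc[i, j] = v          # impute one corner of the box
        w = ridge_fit(Xc, y, lam)  # retrain on the imputed dataset
        preds.append(float(x_test @ w))
    return min(preds), max(preds)

# Tiny illustrative dataset with one missing entry.
X = np.array([[1.0, np.nan],
              [2.0, 1.0],
              [3.0, 2.0]])
y = np.array([1.0, 2.0, 3.0])
lower = np.zeros_like(X)
upper = 2.0 * np.ones_like(X)
lo, hi = naive_prediction_bounds(X, y, lower, upper, np.array([1.0, 1.0]))
```

Note that corner enumeration is only a heuristic even when feasible: for a general convex learner the extremal prediction need not occur at a corner of the interval box, which is a further reason a method with certified bounds, as described in the summary, is needed.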
DOI: 10.48550/arxiv.1803.00218