New Insights Into Learning With Correntropy-Based Regression
Published in: Neural Computation, Vol. 33, No. 1, pp. 157–173
Format: Journal Article
Language: English
Published: MIT Press, One Rogers Street, Cambridge, MA 02142-1209, USA, 01.01.2021
Summary: Stemming from information-theoretic learning, the correntropy criterion and its applications to machine learning tasks have been extensively studied and explored. Its application to regression problems leads to the robustness-enhanced regression paradigm: correntropy-based regression. Having drawn a great variety of successful real-world applications, its theoretical properties have also been investigated recently in a series of studies from a statistical learning viewpoint. The resulting big picture is that correntropy-based regression regresses toward the conditional mode function or the conditional mean function robustly under certain conditions. Continuing this trend and going further, in this study, we report some new insights into this problem. First, we show that under the additive noise regression model, such a regression paradigm can be deduced from minimum distance estimation, implying that the resulting estimator is essentially a minimum distance estimator and thus possesses robustness properties. Second, we show that the regression paradigm in fact provides a unified approach to regression problems in that it approaches the conditional mean, the conditional mode, and the conditional median functions under certain conditions. Third, we present some new results when it is used to learn the conditional mean function by developing its error bounds and exponential convergence rates under conditional (1 + ε)-moment assumptions. The saturation effect on the established convergence rates, which was observed under (1 + ε)-moment assumptions, still occurs, indicating the inherent bias of the regression estimator. These novel insights deepen our understanding of correntropy-based regression, help cement the theoretical correntropy framework, and enable us to investigate learning schemes induced by general bounded nonconvex loss functions.
ISSN: 0899-7667 (print), 1530-888X (electronic)
DOI: 10.1162/neco_a_01334
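As a concrete illustration of the paradigm the summary describes, the sketch below (not code from the paper; all names, parameters, and data are illustrative) fits a linear model under the correntropy-induced loss ℓ_σ(t) = σ²(1 − exp(−t²/σ²)) arising from a Gaussian kernel of bandwidth σ. Because the loss is bounded by σ², gross outliers contribute only a capped amount to the empirical risk, which is the robustness property the summary refers to. The optimizer is the standard half-quadratic (iteratively reweighted least squares) fixed point commonly used with the maximum correntropy criterion; the least-squares initialization is a pragmatic choice of this sketch, not something prescribed by the paper.

```python
# Illustrative sketch only (not from the paper): correntropy-based linear
# regression via the standard half-quadratic / iteratively reweighted
# least-squares (IRLS) fixed point.
#
# Induced loss:  ell_sigma(t) = sigma**2 * (1 - exp(-t**2 / sigma**2))
# IRLS weights:  theta_i = exp(-r_i**2 / sigma**2),  r_i = y_i - X[i] @ w,
# so samples with huge residuals (outliers) receive weights near zero.
import numpy as np

def fit_linear_correntropy(X, y, sigma=1.0, n_iter=50):
    """Approximately minimize (1/n) * sum_i ell_sigma(y_i - X[i] @ w)."""
    # Least-squares initialization: a pragmatic choice of this sketch.
    w = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        r = y - X @ w                            # current residuals
        theta = np.exp(-(r / sigma) ** 2)        # per-sample correntropy weights
        Xt = theta[:, None] * X                  # Theta @ X
        w = np.linalg.solve(X.T @ Xt, Xt.T @ y)  # weighted least-squares step
    return w

# Toy usage: additive-noise model y = 1 + 2x + noise, with 5% gross outliers.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.uniform(-1.0, 1.0, size=200)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.standard_normal(200)
y[:10] += 15.0  # contaminate a few responses
print(fit_linear_correntropy(X, y, sigma=0.5))  # should recover roughly [1.0, 2.0]
```

On such contaminated data, an ordinary least-squares fit would be pulled toward the outliers, while the reweighted fit should stay close to the inlier model. Shrinking σ increases robustness but also the inherent bias of the estimator, consistent with the saturation effect discussed in the summary.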