Analysis of Regression Algorithms with Unbounded Sampling

Bibliographic Details
Published in: Neural Computation, Vol. 32, No. 10, pp. 1980-1997
Main Authors: Tong, Hongzhi; Gao, Jiajing
Format: Journal Article
Language: English
Published: One Rogers Street, Cambridge, MA 02142-1209, USA: MIT Press, 01.10.2020

Summary: In this letter, we study a class of regularized regression algorithms for which the sampling process is unbounded. By choosing different loss functions, the class covers a wide range of commonly used regression algorithms. Unlike prior theoretical analyses of unbounded sampling, our setting places no constraint on the output variables. Through an error analysis, we prove consistency and finite-sample bounds on the excess risk of the proposed algorithms under regular conditions.
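The summary describes a family of regularized regression estimators in which the loss function is a free choice. The sketch below is not the authors' algorithm; it is a generic, minimal illustration (assumed setup: a linear model fit by gradient descent on a regularized empirical risk) showing how swapping the loss gradient, e.g. squared versus Huber, yields different members of the family, the latter being a common choice when outputs are unbounded or heavy-tailed:

```python
import numpy as np

def squared_loss_grad(residual):
    # Gradient of the squared loss l(r) = r^2 / 2 with respect to r.
    return residual

def huber_loss_grad(residual, delta=1.0):
    # Gradient of the Huber loss: linear growth outside [-delta, delta],
    # which tempers the influence of large (unbounded) outputs.
    return np.clip(residual, -delta, delta)

def fit_regularized(X, y, loss_grad, lam=0.1, lr=0.01, n_iters=500):
    """Gradient descent on (1/n) * sum_i l(w . x_i - y_i) + (lam/2) * ||w||^2."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        residuals = X @ w - y
        grad = X.T @ loss_grad(residuals) / n + lam * w
        w -= lr * grad
    return w

# Synthetic data with unbounded (Gaussian-noise) outputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + rng.normal(scale=0.5, size=200)

w_sq = fit_regularized(X, y, squared_loss_grad)   # least-squares member
w_hub = fit_regularized(X, y, huber_loss_grad)    # robust (Huber) member
```

Both fits use the same regularization and optimizer; only the loss gradient changes, which is the sense in which one algorithm class covers many regression methods.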
Bibliography: October 2020
ISSN: 0899-7667
EISSN: 1530-888X
DOI: 10.1162/neco_a_01313