Landmark-Based Ensemble Learning with Random Fourier Features and Gradient Boosting
| Published in | Machine Learning and Knowledge Discovery in Databases, Vol. 12459, pp. 141–157 |
|---|---|
| Format | Book Chapter |
| Language | English |
| Published | Switzerland: Springer International Publishing, 2021 |
| Series | Lecture Notes in Computer Science |
| Summary: | This paper jointly leverages two state-of-the-art learning strategies—gradient boosting (GB) and kernel Random Fourier Features (RFF)—to address the problem of kernel learning. Our study builds on a recent result showing that one can learn a distribution over the RFF to produce a new kernel suited for the task at hand. For learning this distribution, we exploit a GB scheme expressed as ensembles of RFF weak learners, each of them being a kernel function designed to fit the residual. Unlike Multiple Kernel Learning techniques that make use of a pre-computed dictionary of kernel functions to select from, at each iteration we fit a kernel by approximating it from the training data as a weighted sum of RFF. This strategy allows one to build a classifier based on a small ensemble of learned kernel "landmarks" better suited for the underlying application. We conduct a thorough experimental analysis to highlight the advantages of our method compared to both boosting-based and kernel-learning state-of-the-art methods. |
|---|---|
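The core idea in the summary—gradient boosting where each weak learner is a small random-Fourier-feature map fitted to the current residual—can be illustrated with a generic sketch. This is not the authors' exact algorithm (their method targets classification and learns a distribution over RFF); it is a simplified regression version, assuming numpy, with all function names (`rff_features`, `predict`) being hypothetical illustrations:

```python
import numpy as np

def rff_features(X, W, b):
    # Random Fourier features: sqrt(2/D) * cos(XW^T + b) approximates
    # an RBF kernel when W is drawn from a Gaussian (Rahimi & Recht).
    return np.sqrt(2.0 / W.shape[0]) * np.cos(X @ W.T + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]  # toy regression target

n_rounds, n_feats, lr = 10, 25, 0.5
residual = y.copy()
ensemble = []  # list of (W, b, coef) weak learners

for _ in range(n_rounds):
    # Each boosting round draws a fresh small random-feature map ...
    W = rng.normal(size=(n_feats, X.shape[1]))
    b = rng.uniform(0, 2 * np.pi, size=n_feats)
    Phi = rff_features(X, W, b)
    # ... and fits it to the current residual by least squares,
    # i.e. the weak learner is a weighted sum of RFF.
    coef, *_ = np.linalg.lstsq(Phi, residual, rcond=None)
    residual -= lr * (Phi @ coef)
    ensemble.append((W, b, coef))

def predict(Xq):
    # The final model is the (learning-rate-scaled) sum of weak learners.
    out = np.zeros(len(Xq))
    for W, b, coef in ensemble:
        out += lr * rff_features(Xq, W, b) @ coef
    return out

mse = np.mean((predict(X) - y) ** 2)
```

Each round's `(W, b, coef)` triple plays the role of a learned kernel "landmark": the ensemble stays small because every landmark is fitted to what the previous ones failed to explain, rather than being selected from a pre-computed dictionary.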
Bibliography: | Electronic supplementary material: The online version of this chapter (https://doi.org/10.1007/978-3-030-67664-3_9) contains supplementary material, which is available to authorized users. |
ISBN: | 3030676633 9783030676636 |
ISSN: | 0302-9743 1611-3349 |
DOI: | 10.1007/978-3-030-67664-3_9 |