Robust, randomized preconditioning for kernel ridge regression

Bibliographic Details
Published in: arXiv.org
Main Authors: Díaz, Mateo; Epperly, Ethan N; Frangella, Zachary; Tropp, Joel A; Webber, Robert J
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 10.07.2024

Summary: This paper investigates two randomized preconditioning techniques for solving kernel ridge regression (KRR) problems with a medium to large number of data points (\(10^4 \leq N \leq 10^7\)), and it introduces two new methods with state-of-the-art performance. The first method, RPCholesky preconditioning, accurately solves the full-data KRR problem in \(O(N^2)\) arithmetic operations, assuming sufficiently rapid polynomial decay of the kernel matrix eigenvalues. The second method, KRILL preconditioning, offers an accurate solution to a restricted version of the KRR problem involving \(k \ll N\) selected data centers at a cost of \(O((N + k^2) k \log k)\) operations. The proposed methods solve a broad range of KRR problems, making them ideal for practical applications.
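To make the summary concrete, here is a minimal Python sketch of the RPCholesky preconditioning idea it describes: a randomly pivoted partial Cholesky factor \(F F^T \approx K\) is turned, via the Woodbury identity, into a preconditioner for conjugate gradient on \((K + \lambda I)x = y\). The function names, the rank parameter, and the toy data below are illustrative assumptions for this sketch, not the authors' reference implementation.

```python
import numpy as np

def rpcholesky(K, rank, rng=None):
    """Randomly pivoted partial Cholesky: returns F with F @ F.T ~ K.
    Illustrative sketch; assumes K is symmetric positive semidefinite
    and rank is well below the numerical rank of K."""
    rng = np.random.default_rng(rng)
    N = K.shape[0]
    F = np.zeros((N, rank))
    d = np.diag(K).copy()                      # diagonal of the residual matrix
    for i in range(rank):
        # sample the pivot with probability proportional to the residual diagonal
        p = rng.choice(N, p=d / d.sum())
        g = K[:, p] - F[:, :i] @ F[p, :i]      # residual column at the pivot
        F[:, i] = g / np.sqrt(g[p])
        d = np.maximum(d - F[:, i] ** 2, 0.0)  # clip tiny negatives from roundoff
    return F

def krr_pcg(K, y, lam, rank=100, tol=1e-8, maxit=500):
    """Solve (K + lam*I) x = y by CG, preconditioned with an RPCholesky factor."""
    N = K.shape[0]
    F = rpcholesky(K, rank)
    # Woodbury: (F F^T + lam I)^{-1} v = (v - F (lam I + F^T F)^{-1} F^T v) / lam
    C = np.linalg.cholesky(lam * np.eye(rank) + F.T @ F)
    def precond(v):
        w = np.linalg.solve(C.T, np.linalg.solve(C, F.T @ v))
        return (v - F @ w) / lam
    x = np.zeros(N)
    r = y.copy()                               # residual for x0 = 0
    z = precond(r)
    p = z
    rz = r @ z
    for _ in range(maxit):
        Ap = K @ p + lam * p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(y):
            break
        z = precond(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Toy usage on a small Gaussian kernel (hypothetical data, for illustration only).
rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 1))
K = np.exp(-0.5 * (X - X.T) ** 2)
y = rng.standard_normal(2000)
x = krr_pcg(K, y, lam=1e-3, rank=50)
```

The design point the sketch illustrates: the low-rank factor captures the dominant eigenspace of \(K\), so the preconditioned system is well conditioned whenever the kernel eigenvalues decay quickly, which is the polynomial-decay assumption stated in the summary.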
ISSN: 2331-8422