Scalable Kernel Learning Via the Discriminant Information

Bibliographic Details
Published in: ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 3152-3156
Main Authors: Al, Mert; Hou, Zejiang; Kung, Sun-Yuan
Format: Conference Proceeding
Language: English
Published: IEEE, 01.05.2020
Summary: Kernel approximation methods create explicit, low-dimensional kernel feature maps to deal with the high computational and memory complexity of standard techniques. This work studies a supervised kernel learning methodology to optimize such mappings. We utilize the Discriminant Information criterion, a measure of class separability with a strong connection to Discriminant Analysis. By generalizing this measure to cover a wider range of kernel maps and learning settings, we develop scalable methods to learn kernel features with high discriminant power. Experimental results on several datasets show that our techniques can improve optimization and generalization performance over state-of-the-art kernel learning methods.
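The abstract describes learning explicit kernel feature maps by maximizing a class-separability criterion. The sketch below is an illustrative aid only, not the authors' code: it trains a random-Fourier-style feature map by gradient ascent on a trace-ratio separability score in the spirit of the Discriminant Information criterion. The particular feature map, the score, the regularizer rho, and all hyperparameters here are assumptions made for this example.

```python
# Illustrative sketch only: a trainable random-Fourier-style feature map
# optimized by gradient ascent on a trace-ratio class-separability score.
# The feature map, the score, rho, and all hyperparameters are assumptions
# for this example, not details taken from the paper.
import math
import torch


def feature_map(X, W, b):
    # Explicit low-dimensional kernel feature map phi(x) = cos(x W + b),
    # with learnable frequencies W and phases b.
    return torch.cos(X @ W + b)


def separability_score(Phi, y, rho=1e-3):
    # Trace-ratio score tr[(S_total + rho I)^{-1} S_between]:
    # large when class means are far apart relative to the overall spread.
    N, D = Phi.shape
    Phi_c = Phi - Phi.mean(dim=0, keepdim=True)        # center the features
    S_total = Phi_c.T @ Phi_c / N                      # total scatter matrix
    classes = torch.unique(y)
    means = torch.stack([Phi_c[y == c].mean(dim=0) for c in classes])   # (C, D)
    weights = torch.stack([(y == c).float().mean() for c in classes])   # class priors
    S_between = (means * weights[:, None]).T @ means   # weighted between-class scatter
    return torch.trace(torch.linalg.solve(S_total + rho * torch.eye(D), S_between))


# Toy two-class data; in practice X and y come from a real dataset.
torch.manual_seed(0)
X = torch.randn(200, 5)
y = (X[:, 0] + 0.5 * X[:, 1] > 0).long()

D = 64                                                    # feature-map dimension
W = torch.randn(5, D, requires_grad=True)                 # trainable frequencies
b = (2 * math.pi * torch.rand(D)).requires_grad_(True)    # trainable phases

opt = torch.optim.Adam([W, b], lr=1e-2)
for step in range(200):
    opt.zero_grad()
    loss = -separability_score(feature_map(X, W, b), y)   # ascend the score
    loss.backward()
    opt.step()

print("final separability score:", -loss.item())
```

Maximizing the trace ratio pushes class means apart relative to the overall feature spread, which is the intuition behind discriminant-style criteria; the paper's actual criterion and optimization procedure may differ from this toy setup.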
ISSN: 2379-190X
DOI: 10.1109/ICASSP40776.2020.9053142