Multiview Variational Sparse Gaussian Processes

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 32, No. 7, pp. 2875-2885
Main Authors: Mao, Liang; Sun, Shiliang
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.07.2021

Summary: Gaussian process (GP) models are flexible nonparametric models widely used in a variety of tasks. Variational sparse GP (VSGP) scales GP models to large data sets by summarizing the posterior process with a set of inducing points. In this article, we extend VSGP to handle multiview data. We model each view with a VSGP and augment it with an additional set of inducing points. These VSGPs are coupled together by enforcing the means of their posteriors to agree at the locations of these inducing points. To learn these shared inducing points, we introduce an additional GP model that is defined in the concatenated feature space. Experiments on real-world data sets show that our multiview VSGP (MVSGP) model outperforms single-view VSGP consistently and is superior to state-of-the-art kernel-based multiview baselines for classification tasks.
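To make the coupling idea in the summary concrete, the following is a minimal NumPy sketch, not the authors' implementation: it forms a Titsias-style sparse GP posterior mean for each of two toy views and measures how far those means are from agreeing at a small set of shared inducing locations via a soft squared penalty. The RBF kernel, the fixed (rather than learned) shared points, the soft penalty standing in for the paper's agreement constraint, and all function and variable names are assumptions made for illustration; the paper's additional GP over the concatenated feature space for learning the shared inducing points is omitted here.

import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between the rows of A and the rows of B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def sparse_gp_posterior_mean(X, y, Z, Xstar, noise=0.1):
    # Titsias-style variational sparse GP posterior mean at Xstar,
    # with inducing inputs Z and a Gaussian likelihood of std `noise`.
    Kuu = rbf_kernel(Z, Z) + 1e-6 * np.eye(len(Z))   # jitter for stability
    Kuf = rbf_kernel(Z, X)
    Ksu = rbf_kernel(Xstar, Z)
    Sigma = np.linalg.inv(Kuu + Kuf @ Kuf.T / noise**2)
    m_u = Kuu @ Sigma @ Kuf @ y / noise**2           # optimal mean of q(u)
    return Ksu @ np.linalg.solve(Kuu, m_u)           # predictive mean at Xstar

rng = np.random.default_rng(0)
# Toy two-view data: two feature representations of the same 100 examples.
X1 = rng.normal(size=(100, 2))
X2 = rng.normal(size=(100, 3))
y = np.sin(X1[:, :1]) + 0.1 * rng.normal(size=(100, 1))

Z1, Z2 = X1[:10], X2[:10]     # per-view inducing inputs
Zs1, Zs2 = X1[:5], X2[:5]     # shared inducing points, one copy per view's space

m1 = sparse_gp_posterior_mean(X1, y, Z1, Zs1)
m2 = sparse_gp_posterior_mean(X2, y, Z2, Zs2)

# Coupling term: encourage the two views' posterior means to agree at the
# shared inducing points (a soft stand-in for the paper's agreement condition).
agreement_penalty = np.mean((m1 - m2) ** 2)
print("agreement penalty:", agreement_penalty)

In the full MVSGP model described in the summary, this coupling is built into the variational treatment itself and the shared inducing points are learned rather than fixed, so the sketch should be read only as an illustration of where the agreement between posterior means enters.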
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2020.3008496