The Variational Gaussian Approximation Revisited


Bibliographic Details
Published in: Neural Computation, Vol. 21, No. 3, pp. 786-792
Main Authors: Opper, Manfred; Archambeau, Cédric
Format: Journal Article
Language: English
Published: Cambridge, MA: MIT Press, 01.03.2009

Summary: The variational approximation of posterior distributions by multivariate Gaussians has been much less popular in the machine learning community than the corresponding approximation by factorizing distributions. This is for a good reason: the Gaussian approximation is in general plagued by an O(N^2) number of variational parameters to be optimized, N being the number of random variables. In this letter, we discuss the relationship between the Laplace and the variational approximation, and we show that for models with Gaussian priors and factorizing likelihoods, the number of variational parameters is actually O(N). The approach is applied to Gaussian process regression with non-Gaussian likelihoods.
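
The parameter reduction summarized above can be sketched numerically. The following is a minimal, hypothetical illustration (the function and variable names are mine, not from the letter), assuming a zero-mean Gaussian prior N(0, K) and a likelihood that factorizes over the N latent values: the Gaussian posterior q(f) = N(m, S) can then be parameterized as m = K @ nu and S = (K^-1 + diag(lam))^-1, so only the 2N numbers in (nu, lam) need to be optimized rather than a free mean and covariance with N + N(N+1)/2 entries.

```python
import numpy as np

def variational_gaussian(K, nu, lam):
    """Rebuild the full variational mean and covariance from 2N parameters.

    K   : (N, N) prior covariance (kernel) matrix
    nu  : (N,)   mean-shaping parameters, giving m = K @ nu
    lam : (N,)   positive per-likelihood-term precision contributions
    """
    m = K @ nu
    # S = (K^-1 + diag(lam))^-1; by the Woodbury identity this equals
    # S = K - K (diag(1/lam) + K)^-1 K, avoiding an explicit inverse of K.
    A = np.diag(1.0 / lam) + K
    S = K - K @ np.linalg.solve(A, K)
    return m, S

# Toy usage: a squared-exponential kernel on five inputs.
x = np.linspace(0.0, 1.0, 5)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.3**2) + 1e-8 * np.eye(5)
rng = np.random.default_rng(0)
nu = rng.standard_normal(5)
lam = np.ones(5)  # must stay positive
m, S = variational_gaussian(K, nu, lam)
```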
ISSN: 0899-7667 (print); 1530-888X (online)
DOI: 10.1162/neco.2008.08-07-592