Differentiable samplers for deep latent variable models

Bibliographic Details
Published in: Philosophical transactions of the Royal Society of London. Series A: Mathematical, physical, and engineering sciences, Vol. 381, No. 2247, p. 20220147
Main Authors: Doucet, Arnaud; Moulines, Eric; Thin, Achille
Format: Journal Article
Language: English
Published: England: The Royal Society, 15.05.2023
Summary: Latent variable models are a popular class of models in statistics. Combined with neural networks to improve their expressivity, the resulting deep latent variable models have also found numerous applications in machine learning. A drawback of these models is that their likelihood function is intractable, so approximations have to be carried out to perform inference. A standard approach consists of maximizing instead an evidence lower bound (ELBO) obtained from a variational approximation of the posterior distribution of the latent variables. The standard ELBO can, however, be a very loose bound if the variational family is not rich enough. A generic strategy to tighten such bounds is to rely on an unbiased, low-variance Monte Carlo estimate of the evidence. We review here some recent importance sampling, Markov chain Monte Carlo and sequential Monte Carlo strategies that have been proposed to achieve this. This article is part of the theme issue 'Bayesian inference: challenges, perspectives, and prospects'.
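
As context for the bound-tightening strategy the summary describes, here is a minimal LaTeX sketch of the standard ELBO and its importance-weighted refinement. The notation (p_theta, q_phi, K) is ours, not the record's, and the sketch shows the well-known importance-weighted bound rather than anything specific to this article.

% ELBO for a deep latent variable model p_theta(x, z) with
% variational posterior q_phi(z | x); Jensen's inequality gives:
\[
\log p_\theta(x) \;\geq\; \mathbb{E}_{q_\phi(z \mid x)}\!\left[\log \frac{p_\theta(x, z)}{q_\phi(z \mid x)}\right] =: \mathrm{ELBO}(\theta, \phi; x).
\]
% Tightening: the average of K i.i.d. importance weights is an
% unbiased, lower-variance Monte Carlo estimate of the evidence
% p_theta(x); taking the expectation of its logarithm yields a tighter bound:
\[
\mathcal{L}_K(\theta, \phi; x) \;=\; \mathbb{E}_{z_1, \dots, z_K \stackrel{\mathrm{iid}}{\sim} q_\phi(\cdot \mid x)}\!\left[\log \frac{1}{K} \sum_{k=1}^{K} \frac{p_\theta(x, z_k)}{q_\phi(z_k \mid x)}\right],
\]
% with ELBO = L_1 <= L_K <= log p_theta(x), and
% L_K -> log p_theta(x) as K -> infinity.

The same Jensen argument applies to any unbiased estimator of the evidence, which is why the lower-variance Markov chain Monte Carlo and sequential Monte Carlo estimators mentioned above tighten the bound further.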
One contribution of 16 to a theme issue ‘Bayesian inference: challenges, perspectives, and prospects’.
ISSN: 1364-503X
EISSN: 1471-2962
DOI: 10.1098/rsta.2022.0147