Robust Bayesian Inference for Discrete Outcomes with the Total Variation Distance

Bibliographic Details
Main Authors: Knoblauch, Jeremias; Vomfell, Lara
Format: Journal Article
Language: English
Published: 26.10.2020

Summary: Models of discrete-valued outcomes are easily misspecified if the data exhibit zero-inflation, overdispersion, or contamination. Without additional knowledge about the existence and nature of this misspecification, model inference and prediction are adversely affected. Here, we introduce a robust discrepancy-based Bayesian approach using the Total Variation Distance (TVD). In the process, we address and resolve two challenges: First, we study convergence and robustness properties of a computationally efficient estimator for the TVD between a parametric model and the data-generating mechanism. Second, we provide an efficient inference method adapted from Lyddon et al. (2019), which corresponds to formulating an uninformative nonparametric prior directly over the data-generating mechanism. Lastly, we empirically demonstrate that our approach is robust and significantly improves predictive performance on a range of simulated and real-world data.
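To make the central quantity concrete: for discrete outcomes, the TVD between a parametric pmf and the data-generating mechanism is half the absolute difference of the two probability mass functions, summed over the support. The sketch below is a naive plug-in estimate comparing a Poisson model against the empirical frequencies of a sample; it is an illustration of the distance itself, not the paper's (more efficient) estimator, and the Poisson model, truncation limit, and example data are assumptions chosen here for clarity.

```python
from collections import Counter
import math

def poisson_pmf(k, lam):
    # Poisson probability mass function: e^{-lam} * lam^k / k!
    return math.exp(-lam) * lam ** k / math.factorial(k)

def empirical_tvd(data, lam, support_max=100):
    """Plug-in TVD between Poisson(lam) and the empirical pmf of `data`.

    TVD(p, q) = 0.5 * sum_k |p(k) - q(k)|; the model's tail beyond
    `support_max` is ignored, so this slightly underestimates the distance.
    """
    n = len(data)
    freq = Counter(data)  # empirical counts of each observed value
    return 0.5 * sum(
        abs(poisson_pmf(k, lam) - freq.get(k, 0) / n)
        for k in range(support_max + 1)
    )

# A zero-inflated sample: the excess zeros are poorly captured by a
# Poisson(2) model, which shows up as a large TVD (roughly 0.4 here).
data = [0] * 8 + [2, 3, 2, 4, 3, 2]
print(round(empirical_tvd(data, lam=2.0), 3))
```

Because the TVD is bounded in [0, 1], a single outlier or a block of contaminated observations can shift the distance only by its empirical mass, which is the intuition behind the robustness properties studied in the paper.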
DOI: 10.48550/arxiv.2010.13456