FiLM-Ensemble: Probabilistic Deep Learning via Feature-wise Linear Modulation

Bibliographic Details
Main Authors: Turkoglu, Mehmet Ozgur; Becker, Alexander; Gündüz, Hüseyin Anil; Rezaei, Mina; Bischl, Bernd; Daudt, Rodrigo Caye; D'Aronco, Stefano; Wegner, Jan Dirk; Schindler, Konrad
Format: Journal Article
Language: English
Published: 31.05.2022

Summary: The ability to estimate epistemic uncertainty is often crucial when deploying machine learning in the real world, but modern methods often produce overconfident, uncalibrated uncertainty predictions. A common approach to quantifying epistemic uncertainty, usable across a wide class of prediction models, is to train a model ensemble. In a naive implementation, the ensemble approach has high computational cost and high memory demand. This is particularly challenging for modern deep learning, where even a single deep network is already demanding in terms of compute and memory, and it has given rise to a number of attempts to emulate a model ensemble without actually instantiating separate ensemble members. We introduce FiLM-Ensemble, a deep, implicit ensemble method based on the concept of Feature-wise Linear Modulation (FiLM). That technique was originally developed for multi-task learning, with the aim of decoupling different tasks. We show that the idea can be extended to uncertainty quantification: by modulating the network activations of a single deep network with FiLM, one obtains a model ensemble with high diversity, and consequently well-calibrated estimates of epistemic uncertainty, at low computational overhead. Empirically, FiLM-Ensemble outperforms other implicit ensemble methods and comes very close to the upper bound of an explicit ensemble of networks (sometimes even beating it), at a fraction of the memory cost.
DOI:10.48550/arxiv.2206.00050
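
The core mechanism the abstract describes — a single shared network whose activations are modulated per ensemble member by a channel-wise affine transform (FiLM) — can be sketched in a few lines of NumPy. The function name, shapes, and parameter values below are illustrative assumptions for exposition, not the paper's actual implementation:

```python
import numpy as np

def film_modulate(x, gammas, betas):
    """Feature-wise linear modulation for an implicit ensemble (sketch).

    x:      (batch, channels) activations from the shared backbone.
    gammas: (n_members, channels) per-member multiplicative FiLM parameters.
    betas:  (n_members, channels) per-member additive FiLM parameters.

    Returns (n_members, batch, channels): one modulated copy of the
    activations per ensemble member, computed in a single broadcast.
    """
    return gammas[:, None, :] * x[None, :, :] + betas[:, None, :]

# Illustrative numbers: 3 ensemble members, batch of 4, 8 channels.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
gammas = rng.normal(loc=1.0, scale=0.1, size=(3, 8))  # near identity
betas = rng.normal(scale=0.1, size=(3, 8))

out = film_modulate(x, gammas, betas)
# Downstream, each member's copy continues through the (shared) network;
# the mean over members gives the prediction, their spread the uncertainty.
```

Because the backbone weights are shared and only the small per-channel (gamma, beta) vectors differ between members, the memory overhead over a single network is tiny compared to an explicit ensemble.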