Neural Conditional Probability for Inference

Bibliographic Details
Published in: arXiv.org
Main Authors: Kostic, Vladimir R.; Lounici, Karim; Pacreau, Gregoire; Novelli, Pietro; Turri, Giacomo; Pontil, Massimiliano
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 01.07.2024
Summary: We introduce NCP (Neural Conditional Probability), a novel operator-theoretic approach for learning conditional distributions, with a particular focus on inference tasks. NCP can be used to build conditional confidence regions and to extract key statistics such as conditional quantiles, means, and covariances. It offers streamlined learning through a single unconditional training phase, enabling efficient inference without retraining even when the conditioning variable changes. By tapping into the powerful approximation capabilities of neural networks, our method efficiently handles a wide variety of complex probability distributions, including nonlinear relationships between input and output variables. Theoretical guarantees ensure both optimization consistency and statistical accuracy of the NCP method. Our experiments show that NCP matches or outperforms leading methods while using only a simple Multi-Layer Perceptron (MLP) with two hidden layers and GELU activations. This demonstrates that a minimalistic architecture paired with a theoretically grounded loss function can remain competitive with more complex architectures.
ISSN: 2331-8422
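
The abstract reports that NCP's experiments use only a simple MLP with two hidden layers and GELU activations. The PyTorch sketch below illustrates what such a backbone could look like; the class name, hidden width, and embedding dimension are illustrative assumptions, not values taken from the paper, and the NCP training objective itself is not specified in the abstract, so it is omitted here.

    import torch
    import torch.nn as nn

    # Minimal sketch of the two-hidden-layer GELU MLP the abstract
    # describes. The hidden width (64), embedding size (32), and class
    # name are hypothetical choices, not values from the paper.
    class TwoLayerGELUMLP(nn.Module):
        def __init__(self, in_dim, hidden_dim=64, embed_dim=32):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(in_dim, hidden_dim),
                nn.GELU(),
                nn.Linear(hidden_dim, hidden_dim),
                nn.GELU(),
                nn.Linear(hidden_dim, embed_dim),
            )

        def forward(self, x):
            return self.net(x)

    # Usage: embed a batch of 8 three-dimensional inputs.
    model = TwoLayerGELUMLP(in_dim=3)
    z = model(torch.randn(8, 3))  # z has shape (8, 32)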