Neural Conditional Probability for Inference
Format: Journal Article
Language: English
Published: 01.07.2024
DOI: 10.48550/arxiv.2407.01171
Summary: We introduce NCP (Neural Conditional Probability), a novel operator-theoretic approach for learning conditional distributions with a particular focus on inference tasks. NCP can be used to build conditional confidence regions and to extract important statistics such as conditional quantiles, mean, and covariance. It offers streamlined learning through a single unconditional training phase, enabling efficient inference without retraining even when the conditioning changes. By tapping into the powerful approximation capabilities of neural networks, our method efficiently handles a wide variety of complex probability distributions, effectively dealing with nonlinear relationships between input and output variables. Theoretical guarantees ensure both optimization consistency and statistical accuracy of the NCP method. Our experiments show that our approach matches or beats leading methods using a simple Multi-Layer Perceptron (MLP) with two hidden layers and GELU activations, demonstrating that a minimalistic architecture with a theoretically grounded loss function can achieve competitive results even against more complex architectures.
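The abstract's claim that a simple MLP with two hidden layers and GELU activations suffices is concrete enough to sketch. Below is a minimal PyTorch sketch of such a network; the hidden width, input/output dimensions, and the embedding-style head are illustrative assumptions rather than hyperparameters taken from the paper, and the NCP loss itself is not reproduced here.

```python
# Minimal sketch of the two-hidden-layer GELU MLP the abstract mentions.
# Hidden width (128) and input/output dimensions are hypothetical; the
# paper's actual hyperparameters and NCP training loss are not shown.
import torch
import torch.nn as nn

class TwoLayerGELUMLP(nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int = 128, out_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.GELU(),
            nn.Linear(hidden_dim, hidden_dim),
            nn.GELU(),
            nn.Linear(hidden_dim, out_dim),  # e.g., a feature/embedding head
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Usage: embed a batch of 10 five-dimensional inputs.
model = TwoLayerGELUMLP(in_dim=5)
features = model(torch.randn(10, 5))  # shape: (10, 64)
```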
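The abstract also highlights extracting conditional quantiles, mean, and covariance, and building conditional confidence regions. The sketch below shows the generic Monte Carlo route to those statistics, assuming access to a sampler for a learned conditional distribution p(y | x); `sample_conditional` is a hypothetical stand-in, not NCP's actual operator-based estimator.

```python
# Generic Monte Carlo extraction of conditional statistics from samples of
# a learned conditional distribution p(y | x). The sampler below is a
# hypothetical placeholder; NCP's own operator-based estimators differ.
import numpy as np

def sample_conditional(x: np.ndarray, n: int) -> np.ndarray:
    # Placeholder sampler: y | x ~ N(sin(x), 0.1) componentwise.
    return np.sin(x) + 0.1 * np.random.randn(n, x.shape[-1])

def conditional_stats(x: np.ndarray, n: int = 10_000, q=(0.05, 0.5, 0.95)):
    ys = sample_conditional(x, n)            # (n, d) samples of y | x
    mean = ys.mean(axis=0)                   # conditional mean
    cov = np.cov(ys, rowvar=False)           # conditional covariance
    quantiles = np.quantile(ys, q, axis=0)   # conditional quantiles
    return mean, cov, quantiles

mean, cov, quantiles = conditional_stats(np.array([0.3, 1.2]))
# The 5%/95% quantile pair gives a simple per-coordinate 90% confidence region.
```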