Quantitative Universal Approximation Bounds for Deep Belief Networks
Format: | Journal Article |
Language: | English |
Published: | 18.08.2022 |
Summary: | We show that deep belief networks with binary hidden units can approximate any multivariate probability density under very mild integrability requirements on the parental density of the visible nodes. The approximation is measured in the $L^q$-norm for $q\in[1,\infty]$ ($q=\infty$ corresponding to the supremum norm) and in Kullback-Leibler divergence. Furthermore, we establish sharp quantitative bounds on the approximation error in terms of the number of hidden units. |
DOI: | 10.48550/arxiv.2208.09033 |
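For reference, the error measures named in the summary can be written out explicitly. In the sketch below, $p$ denotes the target density on $\mathbb{R}^d$ and $p_n$ the density realized by a deep belief network with $n$ binary hidden units; these symbols are illustrative and not taken from the paper itself.

% Standard definitions of the approximation-error measures cited in the abstract.
% p = target density, p_n = DBN density with n binary hidden units (illustrative notation).
\[
  \|p - p_n\|_{L^q} \;=\; \Big( \int_{\mathbb{R}^d} \lvert p(x) - p_n(x) \rvert^{q} \, dx \Big)^{1/q},
  \qquad 1 \le q < \infty,
\]
\[
  \|p - p_n\|_{L^\infty} \;=\; \operatorname*{ess\,sup}_{x \in \mathbb{R}^d} \lvert p(x) - p_n(x) \rvert,
  \qquad
  D_{\mathrm{KL}}(p \,\|\, p_n) \;=\; \int_{\mathbb{R}^d} p(x) \log \frac{p(x)}{p_n(x)} \, dx .
\]

The paper's quantitative bounds control these quantities as functions of the number of hidden units; the precise rates are stated in the full text.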