Exponential ReLU DNN Expression of Holomorphic Maps in High Dimension
Published in: Constructive Approximation, Vol. 55, No. 1, pp. 537–582
Main Authors: Opschoor, Joost A. A.; Schwab, Christoph; Zech, Jakob
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.02.2022
Summary: For a parameter dimension $d \in \mathbb{N}$, we consider the approximation of many-parametric maps $u : [-1,1]^d \to \mathbb{R}$ by deep ReLU neural networks. The input dimension $d$ may be large, and we assume quantitative control of the domain of holomorphy of $u$: i.e., $u$ admits a holomorphic extension to a Bernstein polyellipse $\mathcal{E}_{\rho_1} \times \cdots \times \mathcal{E}_{\rho_d} \subset \mathbb{C}^d$ of semiaxis sums $\rho_i > 1$ containing $[-1,1]^d$. We establish the exponential rate $O(\exp(-b N^{1/(d+1)}))$ of expressive power, in terms of the total NN size $N$ and of the input dimension $d$ of the ReLU NN, in $W^{1,\infty}([-1,1]^d)$. The constant $b > 0$ depends on $(\rho_j)_{j=1}^d$, which characterizes the coordinate-wise sizes of the Bernstein ellipses for $u$. We also prove exponential convergence in stronger norms for the approximation by DNNs with more regular, so-called "rectified power unit" activations. Finally, we extend DNN expression rate bounds to two classes of non-holomorphic functions: in particular, to $d$-variate, Gevrey-regular functions and, by composition, to certain multivariate probability distribution functions with Lipschitz marginals.
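
As a reading aid (not part of the published abstract), the sketch below spells out the standard approximation-theory definition of the Bernstein ellipse $\mathcal{E}_\rho$ and restates the abstract's rate claim in display form; the definition is the conventional one and is assumed, not quoted from the paper, to match the authors' usage.

```latex
% Standard Bernstein ellipse with semiaxis sum \rho > 1 (conventional
% definition; assumed to agree with the authors' notation).
% \mathcal{E}_\rho is the image of the annulus 1 <= |z| <= \rho under
% the Joukowski map z -> (z + 1/z)/2:
\[
  \mathcal{E}_\rho
  \;=\; \Bigl\{ \tfrac{1}{2}\bigl(z + z^{-1}\bigr)
        \;:\; z \in \mathbb{C},\ 1 \le |z| \le \rho \Bigr\}
  \;\subset\; \mathbb{C}.
\]
% Its boundary is the ellipse with foci \pm 1 and semiaxes
% a = (\rho + \rho^{-1})/2 and b = (\rho - \rho^{-1})/2, so that
% a + b = \rho ("semiaxis sum"); the unit circle maps onto [-1,1],
% hence [-1,1] \subset \mathcal{E}_\rho. In this notation the
% abstract's main bound says: there exist ReLU NNs \tilde{u}_N of
% total size N with
\[
  \| u - \tilde{u}_N \|_{W^{1,\infty}([-1,1]^d)}
  \;\le\; C \exp\bigl(-b\,N^{1/(d+1)}\bigr),
  \qquad b = b\bigl((\rho_j)_{j=1}^d\bigr) > 0.
\]
```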
ISSN: 0176-4276; 1432-0940
DOI: 10.1007/s00365-021-09542-5