A Theoretical Analysis of Deep Neural Networks and Parametric PDEs

Bibliographic Details
Published in: Constructive Approximation, Vol. 55, No. 1, pp. 73–125
Main Authors: Kutyniok, Gitta; Petersen, Philipp; Raslan, Mones; Schneider, Reinhold
Format: Journal Article
Language: English
Published: New York: Springer US, 01.02.2022

Summary: We derive upper bounds on the complexity of ReLU neural networks approximating the solution maps of parametric partial differential equations. In particular, we exploit the inherent low dimensionality of the solution manifold, without any knowledge of its concrete shape, to obtain approximation rates significantly superior to those provided by classical neural network approximation results. Concretely, we use the existence of a small reduced basis to construct, for a large variety of parametric partial differential equations, neural networks that approximate the parametric solution maps in such a way that the sizes of these networks essentially depend only on the size of the reduced basis.
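
A minimal sketch of the reduced basis mechanism behind this result, in illustrative notation not quoted from the paper itself: for a parameter set $\mathcal{Y} \subset \mathbb{R}^p$, consider a family of well-posed operator equations
$$B_y u_y = f_y, \qquad u_y \in H,$$
on a Hilbert space $H$. If the solution manifold $\{u_y : y \in \mathcal{Y}\} \subset H$ has low intrinsic dimensionality, there exist basis functions $\psi_1, \dots, \psi_d \in H$ with $d$ small such that
$$\sup_{y \in \mathcal{Y}} \, \min_{c \in \mathbb{R}^d} \Big\| u_y - \sum_{i=1}^{d} c_i \psi_i \Big\|_H \le \varepsilon.$$
A ReLU network approximating the solution map then only has to emulate the coefficient map $y \mapsto (c_1(y), \dots, c_d(y)) \in \mathbb{R}^d$, which is why its size can be bounded essentially in terms of $d$ alone, rather than in terms of a high-dimensional discretization of $H$.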
ISSN: 0176-4276
EISSN: 1432-0940
DOI: 10.1007/s00365-021-09551-4