Numerical Solution of the Parametric Diffusion Equation by Deep Neural Networks


Bibliographic Details
Published in: arXiv.org
Main Authors: Geist, Moritz; Petersen, Philipp; Raslan, Mones; Schneider, Reinhold; Kutyniok, Gitta
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 25.04.2020

Summary: We perform a comprehensive numerical study of the effect of approximation-theoretical results for neural networks on practical learning problems in the context of numerical analysis. As the underlying model, we study the machine-learning-based solution of parametric partial differential equations. Here, approximation theory predicts that the performance of the model should depend only very mildly on the dimension of the parameter space and is determined by the intrinsic dimension of the solution manifold of the parametric partial differential equation. We use various methods to establish comparability between test cases by minimizing the effect of the choice of test cases on the optimization and sampling aspects of the learning problem. We find strong support for the hypothesis that approximation-theoretical effects heavily influence the practical behavior of learning problems in numerical analysis.
ISSN: 2331-8422
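
The following is a minimal sketch (not the authors' code) of the kind of learning problem the summary describes: a small fully connected network is trained to map the parameter of a 1D parametric diffusion equation -(a(x; y) u'(x))' = 1 on (0, 1), u(0) = u(1) = 0, to its finite-difference solution. The coefficient parametrization, grid size, network architecture, and training hyperparameters below are illustrative assumptions, not those used in the paper.

```python
# Minimal sketch: learn the parameter-to-solution map of a 1D parametric
# diffusion problem -(a(x; y) u')' = 1, u(0) = u(1) = 0, where a piecewise-
# constant coefficient a is controlled by a parameter vector y in [-1, 1]^p.
import numpy as np
import torch
import torch.nn as nn

np.random.seed(0)
torch.manual_seed(0)

n_grid, n_param = 64, 4          # interior grid points, parameter dimension
h = 1.0 / (n_grid + 1)

def solve_diffusion(y):
    """Finite-difference solve of -(a u')' = 1 with piecewise-constant a(y)."""
    # coefficient value on each of the n_grid + 1 cells: a = 1 + 0.9 * y_k on block k
    a_cells = 1.0 + 0.9 * np.repeat(y, (n_grid + 1) // n_param + 1)[: n_grid + 1]
    main = (a_cells[:-1] + a_cells[1:]) / h**2      # diagonal of the stiffness matrix
    off = -a_cells[1:-1] / h**2                     # off-diagonals
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.solve(A, np.ones(n_grid))

# training data: random parameters and the corresponding discretized solutions
ys = np.random.uniform(-1.0, 1.0, size=(2000, n_param))
us = np.stack([solve_diffusion(y) for y in ys])
ys_t = torch.tensor(ys, dtype=torch.float32)
us_t = torch.tensor(us, dtype=torch.float32)

# small fully connected network mapping the parameter y to the solution vector
model = nn.Sequential(nn.Linear(n_param, 128), nn.ReLU(),
                      nn.Linear(128, 128), nn.ReLU(),
                      nn.Linear(128, n_grid))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(ys_t), us_t)
    loss.backward()
    opt.step()
print(f"final training MSE: {loss.item():.3e}")
```

In this toy setting the network output dimension equals the number of grid points, so the reported mean-squared error is only a crude proxy for the approximation quality studied in the paper; the paper's point is how such errors scale with the parameter dimension and the intrinsic dimension of the solution manifold.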