On the approximation of functions by tanh neural networks

Bibliographic Details
Published in: Neural networks, Vol. 143, pp. 732-750
Main Authors: De Ryck, Tim; Lanthaler, Samuel; Mishra, Siddhartha
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.11.2021
ISSN: 0893-6080, 1879-2782
DOI: 10.1016/j.neunet.2021.08.015

More Information
Summary: We derive bounds on the error, in high-order Sobolev norms, incurred in the approximation of Sobolev-regular as well as analytic functions by neural networks with the hyperbolic tangent activation function. These bounds provide explicit estimates on the approximation error with respect to the size of the neural networks. We show that tanh neural networks with only two hidden layers suffice to approximate functions at comparable or better rates than much deeper ReLU neural networks.

Highlights:
• Explicit bounds for function approximation in Sobolev norms by tanh neural networks.
• Tanh networks with 2 hidden layers are at least as expressive as deeper ReLU networks.
• Improved convergence rate for neural network approximation of analytic functions.
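For context, the high-order Sobolev norms referred to in the summary measure a function and all of its derivatives up to order k uniformly over the domain; the standard definition (a textbook fact, not a statement quoted from the paper) is

\|f\|_{W^{k,\infty}(\Omega)} = \max_{|\alpha| \le k} \|D^{\alpha} f\|_{L^{\infty}(\Omega)}.

The paper's bounds are constructive and do not involve training. Purely as a loose empirical illustration of the architecture class in question, the sketch below fits a two-hidden-layer tanh network to an analytic target by gradient descent and estimates the sup-norm error on a fine grid; the target function, width, optimizer, and all hyperparameters are arbitrary assumptions for illustration, not the authors' construction.

```python
# Illustrative sketch only: fits a two-hidden-layer tanh network to an
# analytic target and estimates the sup-norm error on a fine grid. This is
# empirical curve fitting, not the constructive approximation from the paper.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Analytic target on [-1, 1] (an arbitrary choice for illustration).
def f(x):
    return torch.sin(torch.pi * x)

# Two hidden layers with tanh activations, matching the architecture class
# studied in the paper (width 32 is an arbitrary illustrative choice).
model = nn.Sequential(
    nn.Linear(1, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 1),
)

x_train = torch.linspace(-1.0, 1.0, 512).unsqueeze(1)
y_train = f(x_train)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(5000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x_train), y_train)
    loss.backward()
    opt.step()

# Approximate the L^infinity error on a finer grid, a crude proxy for the
# k = 0 part of the Sobolev error that the paper bounds explicitly.
x_test = torch.linspace(-1.0, 1.0, 4096).unsqueeze(1)
with torch.no_grad():
    err = (model(x_test) - f(x_test)).abs().max().item()
print(f"approximate sup-norm error: {err:.2e}")
```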