Why and When Can Deep-but Not Shallow-networks Avoid the Curse of Dimensionality: A Review

Bibliographic Details
Published in: International Journal of Automation and Computing, Vol. 14, No. 5, pp. 503-519
Main Authors: Poggio, Tomaso; Mhaskar, Hrushikesh; Rosasco, Lorenzo; Miranda, Brando; Liao, Qianli
Format: Journal Article
Language: English
Published: Beijing: Institute of Automation, Chinese Academy of Sciences; Springer Nature B.V., 01.10.2017
ISSN: 1476-8186, 2153-182X, 1751-8520, 2153-1838
DOI: 10.1007/s11633-017-1054-2

More Information
Summary: The paper reviews and extends an emerging body of theoretical results on deep learning, including the conditions under which it can be exponentially better than shallow learning. A class of deep convolutional networks represents an important special case of these conditions, though weight sharing is not the main reason for their exponential advantage. Implications of a few key theorems are discussed, together with new results, open problems and conjectures.
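As an illustration of the compositional structure underlying these results (a sketch of the standard binary-tree example used in this line of work, not quoted from the record itself), consider a function built from constituent functions that each depend on only two variables:

\[
f(x_1,\dots,x_8) \;=\; h_3\Bigl(h_{21}\bigl(h_{11}(x_1,x_2),\,h_{12}(x_3,x_4)\bigr),\; h_{22}\bigl(h_{13}(x_5,x_6),\,h_{14}(x_7,x_8)\bigr)\Bigr).
\]

When each constituent function h is suitably smooth and local in this sense, a deep network whose architecture mirrors the compositional graph can reach a given approximation accuracy with a number of units that grows only mildly with the input dimension, whereas a generic shallow (one-hidden-layer) network may require a number of units exponential in the dimension; this is the sense in which deep, but not shallow, networks avoid the curse of dimensionality for such function classes.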
Bibliography: 11-5350/TP
Keywords: Machine learning, neural networks, deep and shallow networks, convolutional neural networks, function approximation, deep learning.