Data-driven emergence of convolutional structure in neural networks

Bibliographic Details
Published in: Proceedings of the National Academy of Sciences (PNAS), Vol. 119, No. 40, p. e2201854119
Main Authors: Ingrosso, Alessandro; Goldt, Sebastian
Format: Journal Article
Language: English
Published: United States, National Academy of Sciences, 04.10.2022
Subjects
Abstract Exploiting data invariances is crucial for efficient learning in both artificial and biological neural circuits. Understanding how neural networks can discover appropriate representations capable of harnessing the underlying symmetries of their inputs is thus crucial in machine learning and neuroscience. Convolutional neural networks, for example, were designed to exploit translation symmetry, and their capabilities triggered the first wave of deep learning successes. However, learning convolutions directly from translation-invariant data with a fully connected network has so far proven elusive. Here we show how initially fully connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs, resulting in localized, space-tiling receptive fields. These receptive fields match the filters of a convolutional network trained on the same task. By carefully designing data models for the visual scene, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs, which has long been recognized as the hallmark of natural images. We provide an analytical and numerical characterization of the pattern formation mechanism responsible for this phenomenon in a simple model and find an unexpected link between receptive field formation and tensor decomposition of higher-order input correlations. These results provide a perspective on the development of low-level feature detectors in various sensory modalities and pave the way for studying the impact of higher-order statistics on learning in neural networks.
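The abstract ties receptive-field formation to the tensor decomposition of higher-order input correlations. As a rough, self-contained sketch of that idea (not the paper's actual pipeline; the toy data model, dimensions, and the initialization heuristic are all invented for illustration), one can estimate the empirical third-order moment tensor of centered inputs and pull out a dominant rank-1 component by tensor power iteration:

```python
# Illustrative sketch only: estimate a third-order input-correlation tensor
# and extract a dominant rank-1 component via tensor power iteration.
import numpy as np

rng = np.random.default_rng(0)

def third_order_moment(X):
    """Empirical third-order moment tensor T[i,j,k] = E[x_i x_j x_k]."""
    return np.einsum('ni,nj,nk->ijk', X, X, X) / X.shape[0]

def tensor_power_iteration(T, n_iter=100):
    """Dominant rank-1 component (lam, u) of a symmetric 3-tensor T."""
    u = np.einsum('ijj->i', T)        # heuristic init: partial contraction of T
    u /= np.linalg.norm(u)            # (works here: latent has nonzero 3rd moment)
    for _ in range(n_iter):
        v = np.einsum('ijk,j,k->i', T, u, u)  # the map u -> T(I, u, u)
        v /= np.linalg.norm(v)
        if np.allclose(v, u, atol=1e-12):
            u = v
            break
        u = v
    lam = np.einsum('ijk,i,j,k->', T, u, u, u)  # component strength T(u, u, u)
    return lam, u

# Toy "visual scene": one skewed (non-Gaussian) latent direction in noise.
d, n = 10, 5000
w_true = np.zeros(d)
w_true[3] = 1.0
z = rng.exponential(1.0, size=n) - 1.0      # centered, skewed latent
X = np.outer(z, w_true) + 0.1 * rng.standard_normal((n, d))

lam, u = tensor_power_iteration(third_order_moment(X))
# u aligns with the planted non-Gaussian direction w_true; lam ≈ E[z^3] = 2.
```

In the paper's setting the analogous objects are higher-order correlations of translation-invariant inputs; everything above (the planted direction, noise level, sample size) is made up to keep the sketch self-contained.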
Significance The interplay between data symmetries and network architecture is key for efficient learning in neural networks. Convolutional neural networks perform well in image recognition by exploiting the translation invariance of images. However, learning convolutional structure directly from data has proven elusive. Here we show how a neural network trained on translation-invariant data can autonomously develop a convolutional structure. Our work thus shows that neural networks can learn representations that exploit the data symmetries autonomously, by exploiting higher-order data statistics. We finally identify the maximization of non-Gaussianity as a guiding principle for representation learning in our model, linking discriminative vision tasks and unsupervised feature extraction.
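The significance statement names the maximization of non-Gaussianity as the guiding principle, which connects to classic projection pursuit and ICA. A minimal sketch under toy assumptions (the data model, dimensions, and initialization heuristic are invented for illustration, and this is a standard FastICA-style kurtosis iteration, not the authors' method): find a unit-norm direction whose projection is maximally non-Gaussian.

```python
# Illustrative sketch only: "maximize non-Gaussianity" of a projection w.x
# via a FastICA-style fixed point for the kurtosis contrast E[(w.x)^4].
import numpy as np

rng = np.random.default_rng(1)

def excess_kurtosis(y):
    """Sample excess kurtosis (0 for Gaussian data)."""
    y = y - y.mean()
    return np.mean(y**4) / np.mean(y**2) ** 2 - 3.0

# Toy data: one heavy-tailed (Laplace) latent direction in Gaussian noise.
d, n = 8, 4000
w_true = np.zeros(d)
w_true[2] = 1.0
z = rng.laplace(size=n)                       # excess kurtosis 3
X = np.outer(z, w_true) + 0.1 * rng.standard_normal((n, d))
X -= X.mean(axis=0)

# Whiten the inputs (ZCA), as is standard before kurtosis-based ICA.
evals, evecs = np.linalg.eigh(np.cov(X.T))
Xw = X @ (evecs @ np.diag(evals ** -0.5) @ evecs.T)

# Heuristic init: the coordinate axis with largest empirical kurtosis.
w = np.zeros(d)
w[int(np.argmax([excess_kurtosis(X[:, j]) for j in range(d)]))] = 1.0

# FastICA-style fixed point: w <- E[x (w.x)^3] - 3 w, then renormalize.
for _ in range(50):
    y = Xw @ w
    w_new = (Xw * (y ** 3)[:, None]).mean(axis=0) - 3.0 * w
    w = w_new / np.linalg.norm(w_new)
# w converges to the planted kurtotic direction w_true (up to sign).
```

The design choice here, whitening followed by a fourth-moment fixed point, is the textbook route to kurtosis maximization; the paper's actual analysis works with the higher-order statistics of its structured data models.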
Author Ingrosso, Alessandro
Goldt, Sebastian
Author_xml – sequence: 1
  givenname: Alessandro
  orcidid: 0000-0001-5430-7559
  surname: Ingrosso
  fullname: Ingrosso, Alessandro
  organization: Quantitative Life Sciences, The Abdus Salam International Centre for Theoretical Physics, 34151 Trieste, Italy
– sequence: 2
  givenname: Sebastian
  surname: Goldt
  fullname: Goldt, Sebastian
  organization: Department of Physics, International School of Advanced Studies, 34136 Trieste, Italy
BackLink https://www.ncbi.nlm.nih.gov/pubmed/36161906 (View this record in MEDLINE/PubMed)
CitedBy_id 10.3390/ijms25063270
10.1088/1742-5468/ad01b9
10.1016/j.procs.2023.07.031
10.1088/2632-2153/ad3330
10.1103/PhysRevX.14.031001
10.1073/pnas.2201854119
10.3390/molecules28052410
10.1103/PhysRevResearch.6.023057
10.1038/s42256-023-00772-9
10.1088/1742-5468/ad0a8c
10.1103/PhysRevE.109.034305
ContentType Journal Article
Copyright Copyright National Academy of Sciences Oct 4, 2022
Copyright © 2022 the Author(s). Published by PNAS. 2022
DOI 10.1073/pnas.2201854119
DatabaseName Medline
MEDLINE
MEDLINE (Ovid)
PubMed
CrossRef
Animal Behavior Abstracts
Bacteriology Abstracts (Microbiology B)
Calcium & Calcified Tissue Abstracts
Chemoreception Abstracts
Ecology Abstracts
Entomology Abstracts (Full archive)
Immunology Abstracts
Neurosciences Abstracts
Nucleic Acids Abstracts
Oncogenes and Growth Factors Abstracts
Virology and AIDS Abstracts
Technology Research Database
Environmental Sciences and Pollution Management
Engineering Research Database
AIDS and Cancer Research Abstracts
Algology Mycology and Protozoology Abstracts (Microbiology C)
Biotechnology and BioEngineering Abstracts
Genetics Abstracts
MEDLINE - Academic
PubMed Central (Full Participant titles)
Database_xml – sequence: 1
  dbid: NPM
  name: PubMed
  url: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 2
  dbid: EIF
  name: MEDLINE
  url: https://www.webofscience.com/wos/medline/basic-search
  sourceTypes: Index Database
DeliveryMethod fulltext_linktorsrc
Discipline Sciences (General)
EISSN 1091-6490
ExternalDocumentID 10_1073_pnas_2201854119
36161906
Genre Journal Article
ISSN 0027-8424
1091-6490
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 40
Keywords convolution
receptive fields
invariance
neural networks
Language English
License This article is distributed under Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND).
LinkModel DirectLink
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 23
Author contributions: A.I. initiated the study; and A.I. and S.G. designed research, performed research, contributed new reagents/analytic tools, analyzed data, and wrote the paper.
Edited by Scott Kirkpatrick, The Hebrew University of Jerusalem, Jerusalem, Israel; received February 3, 2022; accepted August 12, 2022 by Editorial Board Member Terrence J. Sejnowski
ORCID 0000-0001-5430-7559
OpenAccessLink https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9546588/
PMID 36161906
PQID 2721740906
PQPubID 42026
PublicationCentury 2000
PublicationDate 2022-10-04
PublicationDecade 2020
PublicationPlace United States
PublicationTitle Proceedings of the National Academy of Sciences - PNAS
PublicationTitleAlternate Proc Natl Acad Sci U S A
PublicationYear 2022
Publisher National Academy of Sciences
Publisher_xml – name: National Academy of Sciences
SourceID pubmedcentral
proquest
crossref
pubmed
SourceType Open Access Repository
Aggregation Database
Index Database
StartPage e2201854119
SubjectTerms Artificial neural networks
Biological Sciences
Circuits
Deep learning
Machine Learning
Nervous system
Neural networks
Neural Networks, Computer
Neurosciences
Object recognition
Pattern formation
Physical Sciences
Receptive field
Tensors
Tiling
Translation
Visual discrimination
Visual pathways
Title Data-driven emergence of convolutional structure in neural networks
URI https://www.ncbi.nlm.nih.gov/pubmed/36161906
https://www.proquest.com/docview/2721740906/abstract/
https://www.proquest.com/docview/2718640625/abstract/
https://pubmed.ncbi.nlm.nih.gov/PMC9546588
Volume 119