Data-driven emergence of convolutional structure in neural networks
Published in | Proceedings of the National Academy of Sciences (PNAS), Vol. 119, No. 40, p. e2201854119
Main Authors | Alessandro Ingrosso, Sebastian Goldt
Format | Journal Article |
Language | English |
Published | National Academy of Sciences, United States, 04.10.2022
Summary | Exploiting data invariances is crucial for efficient learning in both artificial and biological neural circuits. Understanding how neural networks can discover appropriate representations capable of harnessing the underlying symmetries of their inputs is thus crucial in machine learning and neuroscience. Convolutional neural networks, for example, were designed to exploit translation symmetry, and their capabilities triggered the first wave of deep learning successes. However, learning convolutions directly from translation-invariant data with a fully connected network has so far proven elusive. Here we show how initially fully connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs, resulting in localized, space-tiling receptive fields. These receptive fields match the filters of a convolutional network trained on the same task. By carefully designing data models for the visual scene, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs, which has long been recognized as the hallmark of natural images. We provide an analytical and numerical characterization of the pattern formation mechanism responsible for this phenomenon in a simple model and find an unexpected link between receptive field formation and tensor decomposition of higher-order input correlations. These results provide a perspective on the development of low-level feature detectors in various sensory modalities and pave the way for studying the impact of higher-order statistics on learning in neural networks.
Bibliography | Author contributions: A.I. initiated the study; A.I. and S.G. designed research, performed research, contributed new reagents/analytic tools, analyzed data, and wrote the paper. Edited by Scott Kirkpatrick, The Hebrew University of Jerusalem, Jerusalem, Israel; received February 3, 2022; accepted August 12, 2022 by Editorial Board Member Terrence J. Sejnowski.
ISSN | 0027-8424 (print); 1091-6490 (online)
DOI | 10.1073/pnas.2201854119
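
As a rough illustration of the setup described in the summary above (not the authors' code, data model, or exact experiment), the sketch below trains a small fully connected network to discriminate translation-invariant inputs with non-Gaussian local structure from Gaussian inputs, and then inspects the first-layer weights for localized receptive fields. The data construction, network sizes, and training procedure are illustrative assumptions only.

```python
# Illustrative sketch only: a toy discrimination task between translation-
# invariant non-Gaussian inputs and Gaussian inputs, solved by a small
# fully connected network. Data model, sizes, and training details are
# assumptions, not the construction used in the paper.
import numpy as np

rng = np.random.default_rng(0)
D, H = 64, 8          # input pixels (on a ring) and hidden units -- assumed sizes
xi, gain = 2.0, 3.0   # correlation length and nonlinearity gain -- assumed values

# Translation-invariant (circulant) covariance on a ring of D pixels.
dist = np.minimum(np.arange(D), D - np.arange(D))
row = np.exp(-dist**2 / (2 * xi**2))
C = np.array([np.roll(row, i) for i in range(D)])
L = np.linalg.cholesky(C + 1e-6 * np.eye(D))

def sample(n, nongaussian):
    z = (L @ rng.standard_normal((D, n))).T   # Gaussian, translation-invariant inputs
    if nongaussian:
        z = np.tanh(gain * z)                 # elementwise nonlinearity adds higher-order structure
    return z / z.std()                        # crude variance matching between the two classes

def batch(n):
    x = np.vstack([sample(n, True), sample(n, False)])
    y = np.hstack([np.ones(n), -np.ones(n)])
    return x, y

# One-hidden-layer fully connected network, plain SGD on squared loss.
W = rng.standard_normal((H, D)) / np.sqrt(D)
v = rng.standard_normal(H) / np.sqrt(H)
lr = 0.05
for step in range(20_000):
    x, y = batch(32)
    h = np.tanh(x @ W.T)                      # hidden-layer activations
    err = h @ v - y                           # residual of the +/-1 labels
    gv = h.T @ err / len(y)
    gW = ((err[:, None] * v) * (1 - h**2)).T @ x / len(y)
    v -= lr * gv
    W -= lr * gW

# Inspect first-layer receptive fields: localized rows of W (a single bump
# at a different position per hidden unit) would correspond to the
# convolution-like structure discussed in the summary.
for k in range(H):
    print(f"unit {k}: peak at pixel {np.abs(W[k]).argmax()}, |w|_max = {np.abs(W[k]).max():.2f}")
```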