Transition to Linearity of General Neural Networks with Directed Acyclic Graph Architecture
Main Authors | , , |
Format | Journal Article |
Language | English |
Published | 24.05.2022 |
Subjects | |
Summary: | In this paper we show that feedforward neural networks corresponding to arbitrary directed acyclic graphs undergo transition to linearity as their "width" approaches infinity. The width of these general networks is characterized by the minimum in-degree of their neurons, except for the input and first layers. Our results identify the mathematical structure underlying transition to linearity and generalize a number of recent works aimed at characterizing transition to linearity or constancy of the Neural Tangent Kernel for standard architectures. |
DOI: | 10.48550/arxiv.2205.11786 |
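The phenomenon summarized above can be illustrated numerically on the simplest special case of a DAG network: a one-hidden-layer model with NTK-style 1/sqrt(m) output scaling. This is a hedged sketch, not the paper's construction; the function `linearization_gap`, the `tanh` activation, and all parameter values are illustrative choices. It measures how far the network output strays from its first-order Taylor expansion around the initial weights under a fixed-norm parameter perturbation; transition to linearity predicts this gap shrinks as the width `m` grows.

```python
import numpy as np

def linearization_gap(m, d=10, n_trials=20, eps=1.0, seed=0):
    """Mean |f(W + D) - [f(W) + <grad f(W), D>]| for a width-m network
    f(x; W, v) = v . tanh(W x) / sqrt(m), over random fixed-norm
    perturbations D of the hidden weights W."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(d) / np.sqrt(d)  # fixed unit-scale input
    gaps = []
    for _ in range(n_trials):
        W = rng.standard_normal((m, d))      # hidden-layer weights
        v = rng.standard_normal(m)           # output weights
        D = rng.standard_normal((m, d))      # perturbation direction
        D *= eps / np.linalg.norm(D)         # fixed Frobenius norm eps

        def f(Wm):
            return v @ np.tanh(Wm @ x) / np.sqrt(m)

        # Gradient of f w.r.t. W: rows are v_i * sech^2(w_i . x) * x / sqrt(m)
        pre = W @ x
        grad = np.outer(v * (1.0 - np.tanh(pre) ** 2), x) / np.sqrt(m)

        linear_prediction = f(W) + np.sum(grad * D)
        gaps.append(abs(f(W + D) - linear_prediction))
    return float(np.mean(gaps))

for m in [10, 100, 1000, 10000]:
    print(f"width m = {m:6d}   linearization gap = {linearization_gap(m):.2e}")
```

Running the loop shows the gap decaying as the width increases (roughly like 1/sqrt(m) for this architecture), which is the "transition to linearity" the abstract refers to; the paper's contribution is extending this behavior to arbitrary DAG architectures, with width measured by the minimum in-degree of neurons past the first layer.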