Stable ResNet
| Format | Journal Article |
|---|---|
| Language | English |
| Published | 24.10.2020 |
Summary: Deep ResNet architectures have achieved state-of-the-art performance on many
tasks. While they solve the problem of vanishing gradients, they may suffer
from exploding gradients as the depth becomes large (Yang et al. 2017).
Moreover, recent results have shown that ResNet may lose expressivity as the
depth goes to infinity (Yang et al. 2017; Hayou et al. 2019). To resolve these
issues, we introduce a new class of ResNet architectures, called Stable ResNet,
which stabilize the gradient while ensuring expressivity in the infinite-depth
limit.
DOI: 10.48550/arxiv.2010.12859
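The stabilization idea in the summary can be illustrated numerically. The sketch below is not taken from the paper's code; it assumes a uniform per-block scaling factor of 1/√L (one common way to realize depth-stable residual networks), a toy fully connected residual block, and He-style Gaussian initialization. It compares forward-pass activation variance for an unscaled ResNet against the scaled variant over a large depth.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_variance(depth, width=64, scale=1.0):
    """Propagate a random input through `depth` toy residual blocks
    x <- x + scale * W @ relu(x), with Gaussian weights W ~ N(0, 1/width),
    and return the variance of the output activations.

    scale = 1.0 mimics a vanilla ResNet; scale = 1/sqrt(depth) is the
    assumed uniform depth-dependent scaling."""
    x = rng.normal(size=width)
    for _ in range(depth):
        W = rng.normal(size=(width, width)) / np.sqrt(width)  # He-style init
        x = x + scale * W @ np.maximum(x, 0.0)  # residual block with ReLU
    return x.var()

depth = 200
vanilla = forward_variance(depth)                         # variance compounds with depth
stable = forward_variance(depth, scale=1 / np.sqrt(depth))  # variance stays O(1)
print(f"vanilla: {vanilla:.3e}  stable: {stable:.3e}")
```

In the unscaled network each block multiplies the activation variance by roughly a constant factor greater than one, so the output variance grows geometrically in depth; with the 1/√L scaling the per-block contribution shrinks like 1/L, and the product stays bounded as depth grows, which is the behavior the summary refers to as stabilizing the gradient in the infinite-depth limit.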