Stable ResNet

Bibliographic Details
Published in: arXiv.org
Main Authors: Hayou, Soufiane; Clerico, Eugenio; He, Bobby; Deligiannidis, George; Doucet, Arnaud; Rousseau, Judith
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 18.03.2021

Summary: Deep ResNet architectures have achieved state-of-the-art performance on many tasks. While they solve the vanishing-gradient problem, they may suffer from exploding gradients as the depth becomes large (Yang et al. 2017). Moreover, recent results have shown that ResNet may lose expressivity as the depth goes to infinity (Yang et al. 2017; Hayou et al. 2019). To resolve these issues, we introduce a new class of ResNet architectures, called Stable ResNet, which stabilize the gradient while ensuring expressivity in the infinite-depth limit.
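The abstract does not spell out the construction, but a common way to stabilize residual networks at large depth is to rescale each residual branch by a depth-dependent factor. The sketch below illustrates that idea with a uniform 1/√L scaling; the block structure, function names, and the choice of scaling are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def residual_block(x, W):
    # Illustrative residual branch: a linear map followed by ReLU.
    return np.maximum(W @ x, 0.0)

def stable_resnet_forward(x, weights, scale=None):
    """Forward pass through a stack of residual blocks.

    A standard ResNet corresponds to scale=1.0. Here, by default,
    each residual branch is scaled by 1/sqrt(L) for depth L -- an
    assumed uniform scaling in the spirit of stabilizing gradients
    as the depth grows.
    """
    L = len(weights)
    if scale is None:
        scale = 1.0 / np.sqrt(L)  # assumed depth-dependent scaling
    for W in weights:
        x = x + scale * residual_block(x, W)
    return x
```

With random Gaussian weights, the unscaled network's activations grow rapidly with depth, while the scaled version keeps them of moderate size, which is the qualitative effect the abstract attributes to Stable ResNet.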
ISSN: 2331-8422