Scale Normalization

Bibliographic Details
Published in: arXiv.org
Main Authors: Lo, Henry Z; Amaral, Kevin; Ding, Wei
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 26.04.2016

Summary: One of the difficulties of training deep neural networks is caused by improper scaling between layers. Scaling issues introduce exploding / vanishing gradient problems, and have typically been addressed by careful scale-preserving initialization. We investigate the value of preserving scale, or isometry, beyond the initial weights. We propose two methods of maintaining isometry, one exact and one stochastic. Preliminary experiments show that both determinant normalization and scale normalization effectively speed up learning. Results suggest that isometry is important in the beginning of learning, and maintaining it leads to faster learning.
ISSN: 2331-8422
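
The summary names two ways of maintaining isometry, one exact and one stochastic, without giving the update rules. The NumPy sketch below is a minimal, hypothetical reading of them: the "exact" variant rescales a square weight matrix to unit determinant, and the "stochastic" variant rescales weights so a mini-batch's output scale matches its input scale. Both formulas, and the names determinant_normalize and scale_normalize, are illustrative assumptions, not the authors' published method.

# A minimal, hypothetical sketch of the two normalizations named in the
# summary. The record does not give the update rules, so the formulas below
# (unit determinant for the "exact" method, mini-batch norm matching for the
# "stochastic" one) are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)


def determinant_normalize(W):
    """Rescale a square weight matrix so that |det(W)| = 1.

    Makes the linear map volume-preserving; one plausible reading of the
    "exact" method.
    """
    n = W.shape[0]
    scale = np.abs(np.linalg.det(W)) ** (1.0 / n)
    return W / scale


def scale_normalize(W, x_batch):
    """Rescale W so the mean output norm on a batch matches the mean input norm.

    The batch gives only a stochastic estimate of the layer's scale; one
    plausible reading of the "stochastic" method.
    """
    pre = np.linalg.norm(x_batch, axis=1).mean()
    post = np.linalg.norm(x_batch @ W.T, axis=1).mean()
    return W * pre / (post + 1e-12)


# A deliberately mis-scaled layer: outputs explode relative to inputs.
n = 128
W = 3.0 * rng.normal(size=(n, n))
x = rng.normal(size=(64, n))


def scale_ratio(M):
    """Mean output norm divided by mean input norm on the batch x."""
    return np.linalg.norm(x @ M.T, axis=1).mean() / np.linalg.norm(x, axis=1).mean()


print("raw layer, output/input scale :", scale_ratio(W))            # >> 1
print("|det|^(1/n) after det. norm   :",
      np.abs(np.linalg.det(determinant_normalize(W))) ** (1.0 / n))  # ~ 1
print("output/input scale after s.n. :", scale_ratio(scale_normalize(W, x)))  # ~ 1

In this sketch the determinant correction is exact but only controls the volume of the linear map, while the batch-based correction directly targets the input/output scale but varies with the sampled batch; that trade-off mirrors the exact / stochastic split described in the summary.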