Alternating Synthetic and Real Gradients for Neural Language Modeling
Training recurrent neural networks (RNNs) with backpropagation through time (BPTT) has known drawbacks, such as difficulty capturing long-term dependencies in sequences. Successful alternatives to BPTT have not yet been discovered. Recently, backpropagation with synthetic gradients produced by a decoupled neural interface ...
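The abstract describes synthetic gradients produced by a decoupled neural interface (DNI) and, per the title, alternating them with real gradients. Below is a minimal PyTorch sketch of that general idea, not the authors' code: an auxiliary network predicts the gradient of the future loss with respect to the RNN hidden state at a truncated-BPTT segment boundary, and segments alternate between injecting that synthetic signal and relying on the real truncated gradient alone. All sizes, module names, and the specific alternation schedule are illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical toy sizes for a small language model.
vocab, emb_dim, hid_dim, seg_len = 50, 16, 32, 5

embed = nn.Embedding(vocab, emb_dim)
rnn = nn.GRUCell(emb_dim, hid_dim)
head = nn.Linear(hid_dim, vocab)
# DNI-style module: predicts dL_future/dh from the boundary hidden state.
synth_grad = nn.Sequential(nn.Linear(hid_dim, hid_dim), nn.Tanh(),
                           nn.Linear(hid_dim, hid_dim))

model_params = list(embed.parameters()) + list(rnn.parameters()) + list(head.parameters())
opt = torch.optim.Adam(model_params, lr=1e-3)
dni_opt = torch.optim.Adam(synth_grad.parameters(), lr=1e-3)
ce = nn.CrossEntropyLoss()

data = torch.randint(0, vocab, (1, 101))  # toy random token stream
h = torch.zeros(1, hid_dim)

for step, start in enumerate(range(0, data.size(1) - 1, seg_len)):
    opt.zero_grad()

    h_in = h.detach().requires_grad_(True)  # incoming segment boundary state
    h = h_in
    loss = torch.zeros(())
    for t in range(start, min(start + seg_len, data.size(1) - 1)):
        h = rnn(embed(data[:, t]), h)
        loss = loss + ce(head(h), data[:, t + 1])

    # Alternate the boundary signal: odd segments inject a synthetic gradient
    # (predicted future dL/dh) at the outgoing state, even segments use only
    # the real truncated-BPTT gradient.
    if step % 2 == 1:
        with torch.no_grad():
            injected = synth_grad(h)
        loss = loss + (h * injected).sum()  # adds `injected` to dL/dh at the boundary
    loss.backward()
    opt.step()

    # Train the DNI module: its prediction for the incoming boundary state should
    # match the real gradient that just arrived there (a simplification; full DNI
    # can also bootstrap future synthetic terms into this target).
    dni_opt.zero_grad()
    target = h_in.grad.detach()
    dni_loss = ((synth_grad(h_in.detach()) - target) ** 2).mean()
    dni_loss.backward()
    dni_opt.step()
```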
| Published in | arXiv.org |
| --- | --- |
| Format | Paper |
| Language | English |
| Published | Ithaca: Cornell University Library, arXiv.org, 03.06.2022 |