Extensions of recurrent neural network language model

Bibliographic Details
Published in: 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 5528–5531
Main Authors: Mikolov, Tomas; Kombrink, Stefan; Burget, Lukas; Cernocky, Jan Honza; Khudanpur, Sanjeev
Format: Conference Proceeding
Language: English
Published: IEEE, 01.05.2011

Summary: We present several modifications of the original recurrent neural network language model (RNN LM). While this model has been shown to significantly outperform many competitive language modeling techniques in terms of accuracy, the remaining problem is its computational complexity. In this work, we show approaches that lead to a more than 15-fold speedup in both the training and testing phases. Next, we show the importance of using the backpropagation through time algorithm. An empirical comparison with feedforward networks is also provided. Finally, we discuss possibilities for reducing the number of parameters in the model. The resulting RNN model can thus be smaller, faster in both training and testing, and more accurate than the basic one.
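
The summary highlights truncated backpropagation through time (BPTT) as important for training the RNN LM. The sketch below is a minimal illustration of that idea only, not the authors' implementation; the dimensions, learning rate, truncation depth, and toy corpus are all assumptions chosen for brevity.

```python
# Minimal Elman-style RNN language model trained with truncated BPTT.
# All hyperparameters and the random token stream are illustrative.
import numpy as np

rng = np.random.default_rng(0)
V, H, bptt_steps, lr = 10, 16, 4, 0.05   # vocab size, hidden size, truncation, learning rate

U = rng.normal(0, 0.1, (H, V))           # input (one-hot word) -> hidden
W = rng.normal(0, 0.1, (H, H))           # hidden -> hidden (recurrence)
Y = rng.normal(0, 0.1, (V, H))           # hidden -> output logits

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def train_step(tokens):
    """One pass over a token sequence; returns average negative log-likelihood."""
    h = np.zeros(H)
    states, inputs = [h], []             # states[k+1] is the hidden state after input k
    loss = 0.0
    dU, dW, dY = np.zeros_like(U), np.zeros_like(W), np.zeros_like(Y)
    for t in range(len(tokens) - 1):
        x, target = tokens[t], tokens[t + 1]
        h = np.tanh(U[:, x] + W @ states[-1])   # recurrent hidden update
        p = softmax(Y @ h)                      # distribution over the next word
        loss -= np.log(p[target])
        states.append(h)
        inputs.append(x)
        # Backpropagate the error at step t through at most `bptt_steps`
        # earlier steps -- the truncation that keeps training affordable.
        dy = p.copy()
        dy[target] -= 1.0                       # gradient of NLL w.r.t. logits
        dY += np.outer(dy, h)
        dh = Y.T @ dy
        for k in range(t, max(t - bptt_steps, -1), -1):
            dz = dh * (1.0 - states[k + 1] ** 2)  # back through tanh
            dU[:, inputs[k]] += dz                # one-hot input: only column x_k
            dW += np.outer(dz, states[k])
            dh = W.T @ dz                         # carry the error one step back
    n = len(tokens) - 1
    for P, dP in ((U, dU), (W, dW), (Y, dY)):
        P -= lr * dP / n                          # averaged gradient update
    return loss / n

corpus = rng.integers(0, V, size=50)              # toy token stream
for epoch in range(5):
    print(f"epoch {epoch}: avg NLL = {train_step(corpus):.3f}")
```

The truncation depth (`bptt_steps` here) bounds how far each error signal propagates back in time, trading training cost against how much longer-range context the recurrent state can learn to exploit.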
ISBN: 9781457705380, 1457705389
ISSN: 1520-6149
DOI: 10.1109/ICASSP.2011.5947611