An Infinite Restricted Boltzmann Machine
| Published in | Neural Computation, Vol. 28, No. 7, pp. 1265–1288 |
|---|---|
| Main Authors | , |
| Format | Journal Article |
| Language | English |
| Published | MIT Press, One Rogers Street, Cambridge, MA 02142-1209, USA, 01.07.2016 |
| Summary: | We present a mathematical construction for the restricted Boltzmann machine (RBM) that does not require specifying the number of hidden units. In fact, the hidden layer size is adaptive and can grow during training. This is obtained by first extending the RBM to be sensitive to the ordering of its hidden units. Then, with a carefully chosen definition of the energy function, we show that the limit of infinitely many hidden units is well defined. As with the RBM, approximate maximum likelihood training can be performed, resulting in an algorithm that naturally and adaptively adds trained hidden units during learning. We empirically study the behavior of this infinite RBM, showing that its performance is competitive with that of the RBM, while not requiring the tuning of a hidden layer size. |
|---|---|
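The summary describes a hidden layer that grows adaptively during training. As a loose illustration only (not the authors' iRBM, which relies on an ordered energy function and a principled growth criterion derived from it), the sketch below shows a plain Bernoulli RBM trained with one-step contrastive divergence (CD-1) plus a hypothetical `grow()` operation that appends a zero-initialized hidden unit; the class name, method names, and growth trigger are all assumptions introduced here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GrowingRBM:
    """Bernoulli RBM whose hidden layer can grow during training.

    Illustrative sketch only: the paper's iRBM orders its hidden
    units and grows according to the energy function; here we just
    append a zero-initialized unit on demand.
    """

    def __init__(self, n_visible, n_hidden=1):
        self.W = rng.normal(0.0, 0.01, (n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible biases
        self.c = np.zeros(n_hidden)    # hidden biases

    def grow(self):
        # A new unit starts with zero weights and bias, so the model
        # distribution is unchanged at the moment it is added.
        self.W = np.hstack([self.W, np.zeros((self.W.shape[0], 1))])
        self.c = np.append(self.c, 0.0)

    def cd1_step(self, v0, lr=0.1):
        # One step of contrastive divergence (CD-1) on a single
        # binary visible vector v0 of shape (n_visible,).
        ph0 = sigmoid(v0 @ self.W + self.c)                 # P(h=1 | v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)    # sample hidden
        pv1 = sigmoid(h0 @ self.W.T + self.b)               # P(v=1 | h0)
        v1 = (rng.random(pv1.shape) < pv1).astype(float)    # reconstruction
        ph1 = sigmoid(v1 @ self.W + self.c)
        # Approximate likelihood-gradient update.
        self.W += lr * (v0[:, None] * ph0 - v1[:, None] * ph1)
        self.b += lr * (v0 - v1)
        self.c += lr * (ph0 - ph1)
```

A usage pattern mirroring the adaptive idea would be to run `cd1_step` over the data and call `grow()` whenever some capacity criterion fires, so the hidden layer size is discovered rather than fixed in advance.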
| Bibliography: | July 2016 |
|---|---|
| ISSN: | 0899-7667, 1530-888X |
| DOI: | 10.1162/NECO_a_00848 |