Dropout with Tabu Strategy for Regularizing Deep Neural Networks
Dropout has proven to be an effective technique for regularizing deep neural networks (DNNs) and preventing the co-adaptation of neurons. During training, it randomly drops units with probability \(p\). Dropout also provides a way of approximately combining exponentially many d...
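The abstract above describes standard dropout; the paper's tabu variant is not detailed in this excerpt. A minimal sketch of the dropout mechanism as described (dropping units with probability \(p\) during training) is below; the function name and the inverted-scaling convention are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def dropout(x, p, rng, training=True):
    """Inverted dropout: drop each unit with probability p during training.

    Scaling surviving activations by 1/(1-p) keeps the expected activation
    unchanged, so no rescaling is needed at test time.
    """
    if not training or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p  # keep a unit with probability 1 - p
    return x * mask / (1.0 - p)

rng = np.random.default_rng(0)
x = np.ones((4, 8))
y = dropout(x, p=0.5, rng=rng)  # surviving entries are scaled to 2.0
```

Averaging over the random masks is what yields the approximate combination of exponentially many thinned sub-networks mentioned in the abstract.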
Published in: arXiv.org
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 29.08.2018