Dropout with Tabu Strategy for Regularizing Deep Neural Networks

Dropout has proven to be an effective technique for regularization and for preventing the co-adaptation of neurons in deep neural networks (DNNs). It randomly drops units with probability \(p\) during the training stage of a DNN. Dropout also provides a way of approximately combining exponentially many d...
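As context for the abstract, the sketch below shows the standard (inverted) dropout mechanism it refers to: each unit is dropped with probability \(p\) during training and survivors are rescaled. This is a minimal illustrative example assuming a NumPy implementation; the function name and scaling convention are my own, and it does not implement the tabu strategy proposed in the paper.

```python
import numpy as np

def dropout(activations, p=0.5, training=True, rng=None):
    """Inverted dropout: drop each unit with probability p during training."""
    if not training or p == 0.0:
        # At test time all units are kept and no scaling is needed.
        return activations
    rng = rng if rng is not None else np.random.default_rng()
    # Bernoulli mask: each unit survives with probability 1 - p.
    mask = rng.random(activations.shape) >= p
    # Scale survivors by 1 / (1 - p) so expected activations match test time.
    return activations * mask / (1.0 - p)

# Example: a batch of 4 samples with 5 hidden units each.
h = np.ones((4, 5))
print(dropout(h, p=0.5))
```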


Bibliographic Details
Published in: arXiv.org
Main Authors: Ma, Zongjie; Sattar, Abdul; Zhou, Jun; Chen, Qingliang; Su, Kaile
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 29.08.2018
