Multiple sclerosis identification by convolutional neural network with dropout and parametric ReLU

Bibliographic Details
Published in Journal of Computational Science, Vol. 28, pp. 1–10
Main Authors Zhang, Yu-Dong; Pan, Chichun; Sun, Junding; Tang, Chaosheng
Format Journal Article
Language English
Published Elsevier B.V. 01.09.2018

Summary:
•Our improved convolutional neural network combined the parametric rectified linear unit (PReLU) and dropout techniques.
•A 10-layer deep convolutional neural network was established, with 7 convolution layers and 3 fully connected layers.
•Our method achieved a sensitivity of 98.22%, a specificity of 98.24%, and an accuracy of 98.23%.
•Dropout increased the accuracy by 0.88% compared to not using dropout.
•PReLU helped increase the accuracy by 1.91% compared to using ordinary ReLU.

Multiple sclerosis is a condition affecting the brain and/or spinal cord. This study aims to develop an improved convolutional neural network system based on deep learning. We collected 676 multiple sclerosis brain slices and 681 healthy control brain slices, and used data augmentation to increase the size of the training set. Our improved convolutional neural network combines the parametric rectified linear unit (PReLU) and dropout techniques. The final network is a 10-layer deep convolutional neural network with 7 convolution layers and 3 fully connected layers; the retention probabilities of its three dropout layers are set to 0.4, 0.5, and 0.5, respectively. The method achieved a sensitivity of 98.22%, a specificity of 98.24%, and an accuracy of 98.23%. Dropout increased the accuracy by 0.88% compared to not using dropout. PReLU increased the accuracy by 1.92% compared to ordinary ReLU, and by 1.48% compared to leaky ReLU. The proposed method outperforms four state-of-the-art approaches.
ISSN: 1877-7503, 1877-7511
DOI: 10.1016/j.jocs.2018.07.003