Glaucoma detection using entropy sampling and ensemble learning for automatic optic cup and disc segmentation
Published in: Computerized Medical Imaging and Graphics, Vol. 55, pp. 28-41
Main Authors:
Format: Journal Article
Language: English
Published: Elsevier Ltd (Elsevier Science Ltd), United States, 01.01.2017
Subjects:
Summary:
• An ensemble learning based architecture to learn convolutional filters.
• Use of boosting as a computationally efficient learning framework.
• Accurate networks learned from few sample data.

We present a novel method to segment retinal images using ensemble learning based convolutional neural network (CNN) architectures. An entropy sampling technique is used to select informative points, reducing computational complexity while outperforming uniform sampling. The sampled points are used to design a novel boosting-based learning framework for convolutional filters. Filters are learned in several layers, with the output of previous layers serving as the input to the next layer. A softmax logistic classifier is subsequently trained on the output of all learned filters and applied to test images. The output of the classifier is subjected to an unsupervised graph cut algorithm followed by a convex hull transformation to obtain the final segmentation. Our proposed algorithm for optic cup and disc segmentation outperforms existing methods on the public DRISHTI-GS data set on several metrics.
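As an illustration of the entropy-based point selection described above, the following is a minimal sketch, not the authors' implementation: it builds a local-entropy map with scikit-image and draws pixel locations with probability proportional to entropy. The neighbourhood radius, sample count, and proportional-sampling rule are assumptions made for illustration only.

```python
import numpy as np
from skimage import img_as_ubyte
from skimage.filters.rank import entropy
from skimage.morphology import disk

def entropy_sample(gray, n_points=2000, radius=5, seed=None):
    """Draw pixel coordinates with probability proportional to local entropy.

    gray     : 2-D float array in [0, 1], e.g. one channel of a fundus image
    n_points : number of pixel locations to sample (assumed value)
    radius   : radius in pixels of the circular entropy neighbourhood (assumed value)
    """
    rng = np.random.default_rng(seed)
    # Local Shannon entropy of intensities around each pixel
    ent = entropy(img_as_ubyte(gray), disk(radius)).astype(np.float64)
    prob = ent.ravel()
    prob /= prob.sum()                       # normalise the entropy map into a distribution
    idx = rng.choice(prob.size, size=n_points, replace=False, p=prob)
    rows, cols = np.unravel_index(idx, gray.shape)
    return np.stack([rows, cols], axis=1)    # (n_points, 2) array of (row, col) indices
```

Sampling in proportion to local entropy presumably concentrates training points in high-information regions such as the cup and disc boundaries, which is consistent with the abstract's claim that it outperforms uniform sampling at lower computational cost.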
ISSN: 0895-6111, 1879-0771
DOI: 10.1016/j.compmedimag.2016.07.012