Longitudinal multiple sclerosis lesion segmentation: Resource and challenge
Published in | NeuroImage (Orlando, Fla.) Vol. 148; pp. 77-102 |
---|---|
Main Authors | |
Format | Journal Article |
Language | English |
Published | United States: Elsevier Inc., 01.03.2017 |
Subjects | |
Summary: | In conjunction with the ISBI 2015 conference, we organized a longitudinal lesion segmentation challenge providing training and test data to registered participants. The training data consisted of five subjects with a mean of 4.4 time-points, and the test data of fourteen subjects with a mean of 4.4 time-points. All 82 data sets had the white matter lesions associated with multiple sclerosis delineated by two human expert raters. Eleven teams submitted results using state-of-the-art lesion segmentation algorithms to the challenge, with ten teams presenting their results at the conference. We present a quantitative evaluation comparing the consistency of the two raters as well as exploring the performance of the eleven submitted results in addition to three other lesion segmentation algorithms. The challenge presented three unique opportunities: (1) the sharing of a rich data set; (2) collaboration and comparison of the various avenues of research being pursued in the community; and (3) a review and refinement of the evaluation metrics currently in use. We report on the performance of the challenge participants, as well as the construction and evaluation of a consensus delineation. The image data and manual delineations will continue to be available for download through an evaluation website (the Challenge Evaluation Website: http://smart-stats-tools.org/lesion-challenge-2015) as a resource for future researchers in the area. This data resource provides a platform to compare existing methods in a fair and consistent manner to each other and to multiple manual raters.
• Public lesion database of 21 training data sets and 61 testing data sets.
• Fully automated evaluation website.
• Comparison between 14 state-of-the-art algorithms and 2 manual delineators.
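Overlap-based measures such as the Dice similarity coefficient are among the standard metrics used to compare automated lesion segmentations against manual delineations in challenges of this kind; the article itself describes the full metric suite and the consensus-construction procedure. As a hedged illustration only, the minimal Python sketch below computes a Dice score between two binary lesion masks. The function name and the random stand-in masks are assumptions for the example, not code from the challenge evaluation website.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, ref: np.ndarray) -> float:
    """Dice similarity coefficient between two binary lesion masks.

    Returns 1.0 when both masks are empty (a common convention)
    and 0.0 when exactly one of them is empty.
    """
    pred = pred.astype(bool)
    ref = ref.astype(bool)
    intersection = np.logical_and(pred, ref).sum()
    total = pred.sum() + ref.sum()
    if total == 0:
        return 1.0
    return 2.0 * intersection / total

if __name__ == "__main__":
    # Hypothetical example: compare an automated lesion mask against one rater's
    # delineation on a brain-sized voxel grid (random data, illustration only).
    rng = np.random.default_rng(0)
    auto_mask = rng.random((181, 217, 181)) > 0.995   # stand-in for an algorithm's output
    rater_mask = rng.random((181, 217, 181)) > 0.995  # stand-in for a manual delineation
    print(f"Dice overlap: {dice_coefficient(auto_mask, rater_mask):.3f}")
```

In practice the masks would be loaded from the challenge's image files rather than generated randomly, and a single overlap score would be reported alongside lesion-wise detection metrics, as discussed in the paper.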
Bibliography: | PMCID: PMC5344762. These authors co-organized the challenge; all others contributed results. |
ISSN: | 1053-8119; 1095-9572 |
DOI: | 10.1016/j.neuroimage.2016.12.064 |