STAMP: Simultaneous Training and Model Pruning for low data regimes in medical image segmentation
| Published in | Medical Image Analysis, Vol. 81, p. 102583 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Elsevier B.V., 01.10.2022 |
| Subjects | |
Summary: Acquisition of high-quality manual annotations is vital for the development of segmentation algorithms, but creating them requires a substantial amount of expert time and knowledge. Convolutional neural networks need large numbers of labels to train because of the vast number of parameters that must be learned during optimisation. Here, we develop the STAMP algorithm, which allows the simultaneous training and pruning of a UNet architecture for medical image segmentation, using targeted channelwise dropout to make the network robust to the pruning. We demonstrate the technique across segmentation tasks and imaging modalities, and show that, through online pruning, we are able to train networks to much higher performance than the equivalent standard UNet models while reducing their size by more than 85% in terms of parameters. This has the potential to allow networks to be trained directly on datasets where very few labels are available.

Highlights:
•STAMP is an iterative framework to simultaneously train and prune CNN models.
•Targeted dropout was found to stabilise model pruning between iterations.
•STAMP pruned UNet architectures for segmentation, reducing model size by more than 85%.
•The most heavily pruned regions indicate redundancy in the original UNet.
•STAMP leads to a significant increase in performance in low data regimes.
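The summary and highlights describe the core mechanism: training alternates with targeted channelwise dropout and removal of the least important channels, so the network adapts to its shrinking capacity online. The sketch below illustrates that loop in PyTorch on a deliberately tiny stand-in network; it is not the paper's implementation, and the importance measure (batch-norm scale magnitude), the linear dropout schedule, the one-channel-per-iteration prune rate, and names such as `TinySegNet` and `prune_least_important_channel` are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySegNet(nn.Module):
    """Stand-in for the UNet: one prunable conv block plus a classifier head."""
    def __init__(self, channels=16):
        super().__init__()
        self.conv1 = nn.Conv2d(1, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, 2, 3, padding=1)  # 2-class segmentation

    def forward(self, x, drop_probs=None):
        h = F.relu(self.bn1(self.conv1(x)))
        if self.training and drop_probs is not None:
            # Targeted channelwise dropout: drop whole feature maps, with
            # low-importance channels dropped more often than important ones.
            keep = torch.bernoulli(1.0 - drop_probs).view(1, -1, 1, 1)
            h = h * keep.to(h.device)
        return self.conv2(h)

def channel_importance(bn):
    # Proxy for channel importance: magnitude of the batch-norm scale (gamma).
    return bn.weight.detach().abs()

def dropout_probs(importance, max_p=0.5):
    # Rank the channels; the least important gets probability max_p, the most
    # important gets 0 (a linear targeted-dropout schedule, an assumption).
    ranks = importance.argsort().argsort().float()
    return max_p * (1.0 - ranks / (importance.numel() - 1))

def prune_least_important_channel(model):
    # Physically remove the lowest-importance channel by rebuilding the
    # affected layers with sliced copies of their weights.
    imp = channel_importance(model.bn1)
    keep = imp.argsort(descending=True)[:-1].sort().values
    c = keep.numel()
    conv1 = nn.Conv2d(1, c, 3, padding=1)
    conv1.weight.data = model.conv1.weight.data[keep].clone()
    conv1.bias.data = model.conv1.bias.data[keep].clone()
    bn1 = nn.BatchNorm2d(c)
    bn1.weight.data = model.bn1.weight.data[keep].clone()
    bn1.bias.data = model.bn1.bias.data[keep].clone()
    bn1.running_mean = model.bn1.running_mean[keep].clone()
    bn1.running_var = model.bn1.running_var[keep].clone()
    conv2 = nn.Conv2d(c, 2, 3, padding=1)
    conv2.weight.data = model.conv2.weight.data[:, keep].clone()
    conv2.bias.data = model.conv2.bias.data.clone()
    model.conv1, model.bn1, model.conv2 = conv1, bn1, conv2

model = TinySegNet(16)
x = torch.randn(4, 1, 32, 32)                  # dummy images
y = torch.randint(0, 2, (4, 32, 32))           # dummy segmentation labels
for step in range(8):                          # alternate training and pruning
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(10):
        probs = dropout_probs(channel_importance(model.bn1))
        loss = F.cross_entropy(model(x, probs), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    prune_least_important_channel(model)       # model shrinks every iteration
    print(f"step {step}: {sum(p.numel() for p in model.parameters())} params")
```

A real UNet would additionally require slicing the matching decoder and skip-connection weights whenever an encoder channel is removed, and the optimiser is rebuilt after each prune step because pruning replaces the parameter tensors.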
ISSN: 1361-8415, 1361-8423
DOI: 10.1016/j.media.2022.102583