Federated learning: a deep learning model based on resnet18 dual path for lung nodule detection
Published in: Multimedia Tools and Applications, Vol. 82, No. 11, pp. 17437-17450
Main Authors: , ,
Format: Journal Article
Language: English
Published: New York: Springer US, 01.05.2023 (Springer Nature B.V.)
Summary: Lung nodule detection is of vital importance in the prevention of lung cancer. Over the past two decades, most machine learning and deep learning approaches have focused on training models on data collected and stored in centralised repositories. However, as privacy and security become increasingly important, patient data remains scattered across different medical institutions in small, fragmented sets. In this study, we propose a federated learning method for training a lung nodule detection model on horizontally distributed data from different clients. In particular, the federated averaging algorithm is used with a proposed 3D ResNet18 Dual Path Faster R-CNN model to detect lung nodules. Since data quality affects model training, we also propose a sampling-based content diversity algorithm, validated on the LUNA16 dataset, which mitigates model overfitting, improves generalisation, and reduces training time. To further verify the federated 3D ResNet18 Dual Path Faster R-CNN, we compared it with other deep learning based federated learning algorithms. The experimental results show that it achieves the best performance.
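The abstract states that the model is trained with the federated averaging (FedAvg) algorithm. As a minimal sketch of the aggregation step only, the server combines client parameters weighted by each client's local sample count; the function name and the toy two-client example below are illustrative assumptions, not the paper's actual 3D ResNet18 Dual Path Faster R-CNN implementation.

```python
def fedavg(client_weights, client_sizes):
    """Weighted average of client parameter vectors (FedAvg aggregation).

    client_weights: list of per-client parameter lists, all the same length
    client_sizes:   number of local training samples per client
    """
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    # Each global parameter is the sample-size-weighted mean of the
    # corresponding parameter across all clients.
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(n_params)
    ]

# Hypothetical example: two clients with different data volumes.
w_a = [1.0, 2.0]   # client A parameters (100 samples)
w_b = [3.0, 4.0]   # client B parameters (300 samples)
global_w = fedavg([w_a, w_b], [100, 300])
# global_w == [2.5, 3.5]: client B's update dominates because it
# contributed three times as many samples.
```

In practice each round alternates local training on every client with one such aggregation on the server, so raw patient data never leaves its institution.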
ISSN: 1380-7501; 1573-7721
DOI: 10.1007/s11042-022-14107-0