Malignant thoracic lymph node classification with deep convolutional neural networks on real-time endobronchial ultrasound (EBUS) images


Bibliographic Details
Published in: Translational Lung Cancer Research, Vol. 11, No. 1, pp. 14–23
Main Authors: Yong, Seung Hyun; Lee, Sang Hoon; Oh, Sang-Il; Keum, Ji-Soo; Kim, Kyung Nam; Park, Moo Suk; Chang, Yoon Soo; Kim, Eun Young
Format: Journal Article
Language: English
Published: AME Publishing Company, China, 01.01.2022

Summary: Thoracic lymph node (LN) evaluation is essential for the accurate diagnosis of lung cancer and for deciding the appropriate course of treatment. Endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) is considered a standard method for mediastinal nodal staging. This study aims to build a deep convolutional neural network (CNN) for the automatic classification of metastatic malignancies involving thoracic LNs, using EBUS-TBNA. Patients who underwent EBUS-TBNA to assess the presence of malignancy in mediastinal LNs during a ten-month period at Severance Hospital, Seoul, Republic of Korea, were included in the study. Corresponding LN ultrasound images, pathology reports, demographic data, and clinical history were collected and analyzed. A total of 2,394 endobronchial ultrasound (EBUS) images were collected, covering 1,459 benign LNs from 193 patients and 935 malignant LNs from 177 patients. We employed the visual geometry group (VGG)-16 network to classify malignant LNs using only traditional cross-entropy as the classification loss. The sensitivity, specificity, and accuracy of predicting malignancy were 69.7%, 74.3%, and 72.0%, respectively, and the overall area under the curve (AUC) was 0.782. We applied the new loss function to train the network and, using the modified VGG-16, the AUC improved to 0.8. The sensitivity, specificity, and accuracy improved to 72.7%, 79.0%, and 75.8%, respectively. In addition, the proposed network can process 63 images per second on a single mainstream graphics processing unit (GPU) device, making it suitable for real-time analysis of EBUS images. Deep CNNs can effectively classify malignant LNs from EBUS images. Selecting LNs that require biopsy using real-time EBUS image analysis with deep learning is expected to shorten the EBUS-TBNA procedure time, increase lung cancer nodal staging accuracy, and improve patient safety.
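The evaluation metrics reported in the abstract (sensitivity, specificity, accuracy, and AUC) follow standard definitions for a binary malignant-vs-benign classifier. The sketch below is purely illustrative and is not the authors' code: it computes those metrics from toy labels and model scores, with AUC estimated via the Mann-Whitney U statistic (the probability that a randomly chosen malignant case outscores a randomly chosen benign case).

```python
# Illustrative sketch (not the authors' code): computing the metrics
# reported in the abstract for a binary classifier (1 = malignant).

def confusion_counts(labels, preds):
    """Return (TP, TN, FP, FN) for binary labels and predictions."""
    tp = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 1)
    tn = sum(1 for y, p in zip(labels, preds) if y == 0 and p == 0)
    fp = sum(1 for y, p in zip(labels, preds) if y == 0 and p == 1)
    fn = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 0)
    return tp, tn, fp, fn

def sensitivity(tp, fn):
    return tp / (tp + fn)  # true positive rate: malignant LNs correctly flagged

def specificity(tn, fp):
    return tn / (tn + fp)  # true negative rate: benign LNs correctly cleared

def accuracy(tp, tn, fp, fn):
    return (tp + tn) / (tp + tn + fp + fn)

def auc(labels, scores):
    """AUC via the Mann-Whitney U statistic: fraction of
    (positive, negative) pairs where the positive scores higher
    (ties count as half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: six LNs, scores thresholded at 0.5 to get predictions.
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.6, 0.3, 0.1]
preds = [1 if s >= 0.5 else 0 for s in scores]
tp, tn, fp, fn = confusion_counts(labels, preds)
```

In the study, these metrics were computed over held-out EBUS images; a threshold on the network's output probability determines the sensitivity/specificity trade-off, while the AUC summarizes performance across all thresholds.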
Bibliography: ORCID: 0000-0002-3281-5744.
Contributions: (I) Conception and design: EY Kim, YS Chang; (II) Administrative support: MS Park, KN Kim; (III) Provision of study materials and patients: SH Yong, SH Lee, EY Kim; (IV) Collection and assembly of data: SH Yong, SH Lee; (V) Data analysis and interpretation: SH Yong, SI Oh, JS Keum; (VI) Manuscript writing: All authors; (VII) Final approval of manuscript: All authors.
ISSN: 2218-6751; 2226-4477
DOI: 10.21037/tlcr-21-870