Deep learning-based apical lesion segmentation from panoramic radiographs

Bibliographic Details
Published in: Imaging Science in Dentistry, Vol. 52, No. 52, pp. 351-357
Main Authors: Il-Seok Song, Hak-Kyun Shin, Ju-Hee Kang, Jo-Eun Kim, Kyung-Hoe Huh, Won-Jin Yi, Sam-Sun Lee, Min-Suk Heo
Format: Journal Article
Language: Korean
Published: 2022

Summary: Purpose: Convolutional neural networks (CNNs) have rapidly emerged as one of the most promising artificial intelligence methods in medical and dental research. CNNs can provide an effective diagnostic methodology, allowing for the detection of early-stage diseases. This study therefore aimed to evaluate the performance of a deep CNN algorithm for apical lesion segmentation from panoramic radiographs. Materials and Methods: A total of 1,000 panoramic images showing apical lesions were separated into training (n=800, 80%), validation (n=100, 10%), and test (n=100, 10%) datasets. The performance of identifying apical lesions was evaluated by calculating the precision, recall, and F1-score. Results: Of the 180 apical lesions in the test group, 147 were segmented from the panoramic radiographs at an intersection over union (IoU) threshold of 0.3. The F1-scores were 0.828, 0.815, and 0.742 at IoU thresholds of 0.3, 0.4, and 0.5, respectively. Conclusion: This study showed the potential utility of a deep learning-guided approach for the segmentation of apical lesions. The deep CNN algorithm using U-Net demonstrated considerably high performance in detecting apical lesions.
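The abstract reports lesion-level precision, recall, and F1-scores at IoU thresholds of 0.3, 0.4, and 0.5, but does not spell out how those metrics are computed. As an illustration only, the following Python sketch shows one common way to derive them: each predicted lesion region is greedily matched to an unmatched ground-truth region, and a match counts as a true positive when the intersection over union meets the threshold. The bounding-box representation and all function names here are assumptions, not the authors' implementation.

    # Illustrative sketch (not the study's code): lesion-level precision/recall/F1
    # with IoU-threshold matching. Boxes are (x1, y1, x2, y2) in pixels.

    def iou(box_a, box_b):
        """Intersection over union of two axis-aligned boxes."""
        x1 = max(box_a[0], box_b[0])
        y1 = max(box_a[1], box_b[1])
        x2 = min(box_a[2], box_b[2])
        y2 = min(box_a[3], box_b[3])
        inter = max(0, x2 - x1) * max(0, y2 - y1)
        area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
        area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
        union = area_a + area_b - inter
        return inter / union if union > 0 else 0.0

    def detection_metrics(predictions, ground_truths, iou_threshold=0.3):
        """Greedy one-to-one matching of predicted lesions to ground-truth lesions."""
        matched_gt = set()
        true_positives = 0
        for pred in predictions:
            best_iou, best_idx = 0.0, None
            for i, gt in enumerate(ground_truths):
                if i in matched_gt:
                    continue
                score = iou(pred, gt)
                if score > best_iou:
                    best_iou, best_idx = score, i
            if best_idx is not None and best_iou >= iou_threshold:
                matched_gt.add(best_idx)
                true_positives += 1
        precision = true_positives / len(predictions) if predictions else 0.0
        recall = true_positives / len(ground_truths) if ground_truths else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall > 0 else 0.0)
        return precision, recall, f1

    # Toy example (not data from the study):
    preds = [(10, 10, 50, 50), (100, 100, 140, 150)]
    gts = [(12, 12, 48, 52), (200, 200, 240, 240)]
    print(detection_metrics(preds, gts, iou_threshold=0.3))

The conclusion attributes the results to a U-Net-based deep CNN, but this record gives no architectural details (depth, channel widths, input size, loss). The sketch below is therefore only a minimal U-Net-style encoder-decoder in PyTorch, with an arbitrary two-level depth and channel count, to illustrate the skip-connection architecture the name implies.

    import torch
    import torch.nn as nn

    class TinyUNet(nn.Module):
        """Minimal U-Net-style encoder-decoder producing a per-pixel lesion logit."""
        def __init__(self, in_ch=1, base=16):
            super().__init__()
            def block(c_in, c_out):
                return nn.Sequential(
                    nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
                    nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True))
            self.enc1 = block(in_ch, base)
            self.enc2 = block(base, base * 2)
            self.pool = nn.MaxPool2d(2)
            self.bottleneck = block(base * 2, base * 4)
            self.up2 = nn.ConvTranspose2d(base * 4, base * 2, 2, stride=2)
            self.dec2 = block(base * 4, base * 2)
            self.up1 = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
            self.dec1 = block(base * 2, base)
            self.head = nn.Conv2d(base, 1, 1)

        def forward(self, x):
            e1 = self.enc1(x)
            e2 = self.enc2(self.pool(e1))
            b = self.bottleneck(self.pool(e2))
            d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))  # skip connection
            d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
            return self.head(d1)

    # Example: a single grayscale 256x256 crop from a panoramic radiograph
    model = TinyUNet()
    mask_logits = model(torch.randn(1, 1, 256, 256))  # shape (1, 1, 256, 256)

In practice, the per-pixel output would be passed through a sigmoid and thresholded into a binary mask, and the resulting connected regions compared against annotations with a routine like detection_metrics above.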
Bibliography: KISTI1.1003/JNL.JAKO202209769598996
ISSN: 2233-7822, 2233-7830