Segmentation and Classification of Gastric Cancer from Endoscopic Image Dataset with the Aid of Artificial Intelligence
Published in: 2023 7th International Conference on Electronics, Communication and Aerospace Technology (ICECA), pp. 212 - 218
Main Authors:
Format: Conference Proceeding
Language: English
Published: IEEE, 22.11.2023
DOI: 10.1109/ICECA58529.2023.10394686
Summary: Upper GI endoscopy is commonly used to detect early stomach malignancies. An object detection model, a form of deep learning, was previously proposed as a means of automating the diagnosis of early stomach cancer from endoscopic images, yet it proved difficult to reduce false positives in the detected results. This research therefore proposes an automatic classification method for stomach cancer using a pre-trained convolutional neural network (CNN). For classifying cancer in endoscopic images automatically, this method outperforms approaches that rely on traditional, hand-crafted features. The proposed model consists of thirteen convolutional layers with small 3x3 kernels and three fully connected layers. To deal with data scarcity, a transfer learning approach is used, in which the pre-trained layer weights are fine-tuned with the Slime Mould Algorithm (SMA). Experiments employing 1208 images from healthy subjects and 533 images from patients with stomach cancer examined detection performance using 5-fold cross-validation. These findings suggest the proposed strategy will be effective for automating the diagnosis of early stomach cancer in endoscopic images.
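The architecture described in the abstract (thirteen 3x3 convolutional layers plus three fully connected layers) matches the VGG-16 layout, so a minimal sketch of the classification setup can be written with an ImageNet-pretrained VGG-16 whose final layer is replaced for two classes (healthy vs. cancer). This is an assumption-based illustration, not the authors' code: the Slime Mould Algorithm fine-tuning step is not reproduced, and the 224x224 input size and two-class head are assumed rather than taken from the paper.

```python
# Sketch of a pre-trained CNN with 13 conv layers (3x3 kernels) and 3 fully
# connected layers, adapted for binary classification of endoscopic images.
# Hypothetical stand-in for the workflow outlined in the abstract; the SMA-based
# fine-tuning described in the paper is not implemented here.
import torch
import torch.nn as nn
from torchvision import models

def build_pretrained_classifier(num_classes: int = 2) -> nn.Module:
    # VGG-16 has exactly 13 convolutional layers with 3x3 kernels and
    # 3 fully connected layers, matching the architecture in the abstract.
    model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
    # Replace the last fully connected layer so it outputs two classes
    # (healthy vs. stomach cancer).
    model.classifier[6] = nn.Linear(model.classifier[6].in_features, num_classes)
    return model

if __name__ == "__main__":
    net = build_pretrained_classifier()
    dummy = torch.randn(1, 3, 224, 224)   # one RGB endoscopic image, resized
    logits = net(dummy)
    print(logits.shape)                    # torch.Size([1, 2])
```

Evaluation along the lines of the abstract would then split the 1208 healthy and 533 cancer images into five stratified folds and report detection performance averaged over the folds; the optimizer used for fine-tuning (SMA in the paper) would replace the placeholder pre-trained weights shown above.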