A Novel Approach for Increased Convolutional Neural Network Performance in Gastric-Cancer Classification Using Endoscopic Images

Bibliographic Details
Published in: IEEE Access, Vol. 9, pp. 51847-51854
Main Authors: Lee, Sin-Ae; Cho, Hyun Chin; Cho, Hyun-Chong
Format: Journal Article
Language: English
Published: Piscataway, NJ: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2021

Summary: Gastric cancer is the third-most-common cause of cancer-related death worldwide. Fortunately, it can be detected using endoscopy equipment. Computer-aided diagnosis (CADx) systems can help clinicians distinguish cancer from other gastric diseases more accurately. In this paper, we present a CADx system that distinguishes and classifies gastric cancer from pre-cancerous conditions such as gastric polyps, gastric ulcers, gastritis, and bleeding. The system uses a deep-learning model, Xception, which employs depth-wise separable convolutions, to classify images as cancerous or non-cancerous. The proposed method consists of two preprocessing steps: Google's AutoAugment for data augmentation, and image segmentation using simple linear iterative clustering (SLIC) superpixels with the fast and robust fuzzy C-means (FRFCM) algorithm. Together, these approaches yield a feasible method of distinguishing and classifying cancers from other gastric diseases. Using biopsy-supported ground truth, performance is measured on the test sets as the area under the receiver operating characteristic curve (Az). The proposed classification model achieves an Az of 0.96, an improvement of 0.06 over the 0.90 obtained with the original data. Our method is fully automated: no regions of interest are specified manually for testing, and images for model training are selected at random. This methodology may play a crucial role in selecting effective treatment options without the need for a surgical biopsy.
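The Az figure reported in the summary is the area under the ROC curve, which can be computed directly from classifier scores via the Mann-Whitney statistic: the probability that a randomly chosen cancer case receives a higher score than a randomly chosen non-cancer case. The sketch below is illustrative only (it is not the authors' evaluation code) and assumes binary labels with 1 = cancer:

```python
def az_score(scores, labels):
    """Area under the ROC curve (Az) via the Mann-Whitney statistic:
    the fraction of (positive, negative) pairs where the positive case
    scores higher, counting ties as half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    if not pos or not neg:
        raise ValueError("need at least one positive and one negative case")
    # Count pairwise "wins" of positives over negatives; ties count 0.5.
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Perfectly separated toy scores give Az = 1.0; chance-level gives ~0.5.
print(az_score([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0]))  # -> 1.0
```

In practice a library routine such as scikit-learn's `roc_auc_score` computes the same quantity; the pairwise form above is simply the most transparent definition of the metric the paper reports.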
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3069747