Computer-Aided Diagnosis (CAD) to Detect Abnormality on CT Image of Liver
Published in: Journal of Physics: Conference Series, Vol. 1505, No. 1, pp. 12005-12011
Format: Journal Article
Language: English
Published: Bristol: IOP Publishing, 01.03.2020
Summary: Liver cancer appears on CT-scan images with different shapes, locations, and textures in every image. The contrast between abnormal and healthy liver tissue is often indistinguishable, making evaluation difficult. Liver abnormalities include swelling, fibrosis, and the presence of benign or malignant tumors. Large, low-contrast lesions are easily recognized as abnormal, but small masses with low contrast are very hard to evaluate. In this research, a CAD system was developed to support the evaluation of liver abnormality, especially abnormality of small size. The research method used was active contour-based segmentation. The research data were secondary data: abdominal images produced by a Computed Tomography (CT) scanner at the Regional Public Hospital of Cibinong, Bogor. Data were collected by observing image data of abnormal livers from liver cancer patients, patients with normal livers, and patients with other diseases, as diagnosed by a doctor. The data were then processed through feature extraction using Gray-Level Co-occurrence Matrix (GLCM) texture analysis, with an Artificial Neural Network (ANN) used as the machine-learning classifier to detect abnormality in the images. The research found that the ANN can categorize the images into normal and abnormal groups at 89% accuracy, 86% sensitivity, 92% specificity, 91% precision, and 10% overall error.
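The pipeline described in the summary, GLCM texture features feeding a binary normal/abnormal classifier evaluated by accuracy, sensitivity, specificity, and precision, can be sketched as follows. This is a minimal illustration, not the authors' code: the pixel offset, the number of gray levels, and the confusion-matrix counts in the usage example are assumptions, and the counts are hypothetical values chosen only to show how the four metrics are computed.

```python
import numpy as np

def glcm(image, dx=1, dy=0, levels=8):
    """Normalized, symmetric Gray-Level Co-occurrence Matrix for one
    pixel offset (dx, dy). `image` must hold integers in [0, levels)."""
    m = np.zeros((levels, levels), dtype=float)
    h, w = image.shape
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                m[image[y, x], image[y2, x2]] += 1
    m += m.T              # make the matrix symmetric
    return m / m.sum()    # normalize counts to joint probabilities

def glcm_features(p):
    """A few common Haralick-style texture features from a normalized GLCM."""
    i, j = np.indices(p.shape)
    return {
        "contrast": np.sum(p * (i - j) ** 2),
        "energy": np.sum(p ** 2),  # a.k.a. angular second moment
        "homogeneity": np.sum(p / (1.0 + (i - j) ** 2)),
    }

def classification_metrics(tp, tn, fp, fn):
    """The evaluation metrics reported in the paper, from confusion counts."""
    total = tp + tn + fp + fn
    return {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),   # a.k.a. recall
        "specificity": tn / (tn + fp),
        "precision": tp / (tp + fp),
    }

# Usage sketch: texture features of a tiny quantized image patch, then
# metrics from a purely hypothetical confusion matrix (100 test images).
patch = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [2, 2, 3, 3],
                  [2, 2, 3, 3]])
print(glcm_features(glcm(patch, levels=4)))
print(classification_metrics(tp=43, tn=46, fp=4, fn=7))
```

In practice these features would be computed over several offsets and angles for each segmented liver region and stacked into the ANN's input vector; the GLCM loop here is written naively for clarity rather than speed.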
ISSN: 1742-6588, 1742-6596
DOI: 10.1088/1742-6596/1505/1/012005