Chaos and Exponential Scale based Butterfly Optimization Technique for Feature Extraction and Selection in Pap Smear Images


Bibliographic Details
Published in: 2023 10th International Conference on Computing for Sustainable Global Development (INDIACom), pp. 740-744
Main Authors: Haridas, Soumya; T, Jayamalar
Format: Conference Proceeding
Language: English
Published: Bharati Vidyapeeth, New Delhi, 15.03.2023

Summary: Cervical cancer ranks as the second most common cancer in India and the fourth most common worldwide. Among the many available diagnostic procedures, the Pap smear test is the most common: cells collected from the cervix are analyzed, and their proper analysis and classification requires an expert pathologist. Automation can improve the efficiency of the whole process, and in automated systems the accuracy of diagnosis depends on correct classification of the cell images, in which feature extraction and selection play a major role. In this paper, shape, Gray Level Co-occurrence Matrix, color intensity, Local Tetra Pattern, and Gabor features are extracted, and feature selection is carried out using a Chaos and Exponential Scale-based Butterfly Optimization Algorithm, an improved version of the Butterfly Optimization Algorithm that facilitates the search process. The system is evaluated by feeding the selected features to Neural Network and Convolutional Neural Network classifiers. In the performance comparison, the proposed feature selection method gives the better results with the CNN classifier: sensitivity 74.5%, specificity 88.6%, and accuracy 81.8%.
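The abstract names the pipeline stages but not their formulas, so two short sketches follow. First, the Gray Level Co-occurrence Matrix (GLCM) texture descriptors it lists can be computed with standard tooling; scikit-image is an assumed library choice here, not one named in the abstract:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(gray_img):
    """GLCM texture descriptors for a 2-D uint8 grayscale image."""
    # Co-occurrence matrix at distance 1 for two orientations.
    glcm = graycomatrix(gray_img, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    # Standard scalar properties derived from the matrix.
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])
```

Second, a minimal sketch of a chaos- and exponential-scale-modified Butterfly Optimization Algorithm (BOA) for binary feature selection. The global/local search equations follow the standard BOA of Arora and Singh; the logistic chaotic map (for "chaos") and the exponentially growing sensory modality c (for "exponential scale") are plausible readings of the title, not the paper's confirmed formulation. In the paper, fitness(...) would be the validation performance of the NN or CNN classifier on the candidate feature subset (assumed here to lie in [0, 1]).

```python
import numpy as np

def logistic_map(x):
    """Logistic chaotic map on (0, 1); stands in for uniform random draws."""
    return 4.0 * x * (1.0 - x)

def boa_feature_select(fitness, n_features, pop_size=20, iters=50,
                       c0=0.01, a=0.1, switch_p=0.8, seed=0):
    """Select a boolean feature mask maximizing fitness (values in [0, 1])."""
    rng = np.random.default_rng(seed)
    # Continuous positions in [0, 1]; a feature is "selected" when > 0.5.
    pop = rng.random((pop_size, n_features))
    chaos = rng.uniform(0.1, 0.9, pop_size)      # chaotic sequence seeds
    fit = np.array([fitness(ind > 0.5) for ind in pop])
    best = pop[fit.argmax()].copy()

    for t in range(iters):
        # Exponentially scaled sensory modality (assumption): the step
        # scale grows smoothly over the run instead of staying fixed.
        c = c0 * np.exp(t / iters)
        for i in range(pop_size):
            chaos[i] = logistic_map(chaos[i])    # chaotic draw r in (0, 1)
            r = chaos[i]
            frag = c * fit[i] ** a               # fragrance f = c * I^a
            if r < switch_p:                     # global phase: toward best
                cand = pop[i] + (r**2 * best - pop[i]) * frag
            else:                                # local phase: between peers
                j, k = rng.integers(pop_size, size=2)
                cand = pop[i] + (r**2 * pop[j] - pop[k]) * frag
            cand = np.clip(cand, 0.0, 1.0)
            f_cand = fitness(cand > 0.5)
            if f_cand > fit[i]:                  # greedy replacement
                pop[i], fit[i] = cand, f_cand
        best = pop[fit.argmax()].copy()
    return best > 0.5                            # boolean mask of kept features
```

Thresholding a continuous position vector at 0.5 is a common way to adapt a continuous metaheuristic like BOA to the binary feature selection problem; the greedy replacement keeps the sketch elitist without extra bookkeeping.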