An efficient convolutional neural network-based classifier for an imbalanced oral squamous carcinoma cell dataset

Bibliographic Details
Published in: IAES International Journal of Artificial Intelligence, Vol. 13, No. 1, p. 487
Main Authors: Mohapatra, Usha Manasi; Tripathy, Sushreeta
Format: Journal Article
Language: English
Published: 01.03.2024

Summary: Imbalanced datasets pose a major challenge for researchers addressing machine learning tasks. In such datasets, the classes are not represented in equal proportion; rather, the gap between the numbers of samples in the individual classes is significantly large. Classification models perform better on datasets in which both classes contain an equal proportion of data tuples, but in reality medical image datasets are skewed and therefore not always suitable for achieving improved classification performance. Various techniques have been suggested in the literature to overcome this challenge. This paper applies an oversampling technique to an imbalanced dataset and presents a customized convolutional neural network model that classifies the images into two categories: diseased and non-diseased. The outcome of the proposed model can assist health experts in the detection of oral cancer. The proposed model achieves 99% accuracy after data augmentation, and its precision, recall, and F1-score values are very close to 1. In addition, a statistical test is performed to validate the statistical significance of the model. The proposed model is found to be an optimized classifier in terms of the number of network layers and the number of neurons.
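The abstract does not specify the exact architecture or augmentation settings, so the Python sketch below only illustrates the general pipeline it describes: augmentation-based oversampling of the minority class, a compact CNN that outputs a diseased/non-diseased decision, and precision, recall, and F1 computed on a held-out set. The image size, layer widths, minority-class label, and the use of Keras and scikit-learn are assumptions made for illustration, not details taken from the paper.

    # Sketch only: layer sizes, image resolution, and the minority label are assumed,
    # not the configuration reported in the paper.
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers, models
    from sklearn.metrics import precision_score, recall_score, f1_score

    def build_cnn(input_shape=(128, 128, 3)):
        """A compact CNN for two-class (diseased / non-diseased) image classification."""
        model = models.Sequential([
            layers.Input(shape=input_shape),
            layers.Conv2D(32, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Conv2D(64, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Flatten(),
            layers.Dense(64, activation="relu"),
            layers.Dense(1, activation="sigmoid"),  # single sigmoid unit for the binary decision
        ])
        model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
        return model

    # Random flips and small rotations used to generate extra minority-class samples.
    augment = tf.keras.Sequential([
        layers.RandomFlip("horizontal"),
        layers.RandomRotation(0.1),
    ])

    def oversample_minority(x, y, minority_label=1):
        """Augment minority-class images (assumed float arrays in [0, 1]) until class counts match."""
        x_min = x[y == minority_label]
        deficit = int((y != minority_label).sum()) - int((y == minority_label).sum())
        if deficit <= 0:
            return x, y
        idx = np.random.choice(len(x_min), size=deficit, replace=True)
        extra = augment(x_min[idx], training=True).numpy()
        x_bal = np.concatenate([x, extra], axis=0)
        y_bal = np.concatenate([y, np.full(deficit, minority_label, dtype=y.dtype)])
        return x_bal, y_bal

    # Example usage with placeholder arrays (x: images, y: 0/1 labels):
    # x_train, y_train, x_test, y_test = ...  # load the oral squamous cell carcinoma images here
    # x_bal, y_bal = oversample_minority(x_train, y_train)
    # model = build_cnn()
    # model.fit(x_bal, y_bal, epochs=20, batch_size=32, validation_split=0.1)
    # preds = (model.predict(x_test) > 0.5).astype(int).ravel()
    # print(precision_score(y_test, preds), recall_score(y_test, preds), f1_score(y_test, preds))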
ISSN: 2089-4872, 2252-8938
DOI: 10.11591/ijai.v13.i1.pp487-499