Optimization of ReLU Activation Function for Deep-Learning-based Breast Cancer Classification on Mammogram Images

Bibliographic Details
Published in: 2024 IEEE International Conference on Automatic Control and Intelligent Systems (I2CACIS), pp. 267-272
Main Authors: Razali, Noor Fadzilah; Isa, Iza Sazanita; Sulaiman, Siti Noraini; Osman, Muhammad Khusairi; Karim, Noor Khairiah A.; Nordin, Siti Aminah
Format: Conference Proceeding
Language: English
Published: IEEE, 29.06.2024
Summary: The use of deep Convolutional Neural Networks (CNNs) for breast cancer classification on mammogram images has been widely investigated to aid radiologists in clinical diagnosis. Multiple levels of convolution and repeated non-linearities in a CNN's architecture are required to extract significant feature representations. However, when training deeper networks, the vanishing gradient effect, which arises because each weight update depends on a product of partial derivatives of the loss function, can prevent meaningful learning even with additional epochs. The rectified linear unit (ReLU) activation function lessens this problem by activating a neuron, and thereby introducing non-linearity, only when its input is greater than zero. However, this restriction of non-linearity for inputs below zero at the final feature-extraction stage, when producing output probabilities on highly complex data such as mammogram images, degrades network performance. To overcome this, this study proposes an adaptive ReLU whose activation threshold is determined by genetic algorithm (GA) profiling, using mutation and adaptation to improve on the restrictive behaviour of the original ReLU. We modified the adaptive ReLU in the final learning layer of two CNN architectures and evaluated performance on the public INbreast mammogram dataset. Our experiments show accuracy improving from 95.0% to 98.5%, along with better classification performance than other well-known activation functions. Applying evolutionary GAs to activation functions represents an exciting frontier in meta-learning for neural networks.
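The abstract describes two pieces that can be sketched independently of the paper's exact formulation: a ReLU whose activation threshold is a tunable parameter rather than fixed at zero, and a genetic algorithm that searches for the threshold maximizing a fitness score (in the paper, classification performance). The sketch below is a minimal illustration under those assumptions; the function names, GA settings, and the toy fitness function are hypothetical and not taken from the paper.

```python
import numpy as np

def threshold_relu(x, t=0.0):
    """ReLU with an adjustable activation threshold t.
    Standard ReLU is the special case t = 0."""
    return np.where(x > t, x, 0.0)

def evolve_threshold(fitness, pop_size=20, generations=30,
                     bounds=(-1.0, 1.0), mutation_scale=0.1, seed=0):
    """Tiny genetic algorithm for a scalar threshold: keep the
    fitter half of the population (selection) and perturb copies
    of it with Gaussian noise (mutation)."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(bounds[0], bounds[1], size=pop_size)
    for _ in range(generations):
        scores = np.array([fitness(t) for t in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]   # fitter half survives
        children = parents + rng.normal(0.0, mutation_scale, size=parents.shape)
        pop = np.clip(np.concatenate([parents, children]), bounds[0], bounds[1])
    scores = np.array([fitness(t) for t in pop])
    return pop[np.argmax(scores)]

# Toy fitness standing in for validation accuracy: peaks at t = -0.25.
best_t = evolve_threshold(lambda t: -(t + 0.25) ** 2)
```

In the paper's setting the fitness evaluation would instead train or fine-tune the CNN with the candidate threshold in its final learning layer and score it on held-out mammogram data, which is far more expensive per evaluation; the GA structure itself is unchanged.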
ISSN:2995-2859
DOI:10.1109/I2CACIS61270.2024.10649623