An explainable AI-driven deep neural network for accurate breast cancer detection from histopathological and ultrasound images

Bibliographic Details
Published in: Scientific Reports, Vol. 15, No. 1, pp. 17531-34
Main Authors: Alom, Md. Romzan; Farid, Fahmid Al; Rahaman, Muhammad Aminur; Rahman, Anichur; Debnath, Tanoy; Miah, Abu Saleh Musa; Mansor, Sarina
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 20.05.2025
Summary: Breast cancer represents a significant global health challenge, making early and accurate detection essential for improving patient prognosis and reducing mortality rates. However, traditional diagnostic processes that rely on manual analysis of medical images are inherently complex and subject to inter-observer variability, highlighting the urgent need for robust automated breast cancer detection systems. While deep learning has demonstrated potential, many current models struggle with limited accuracy and a lack of interpretability. This research introduces the Deep Neural Breast Cancer Detection (DNBCD) model, an explainable AI-based framework that applies deep learning methods to classify breast cancer from histopathological and ultrasound images. The proposed model employs DenseNet121 as a foundation and integrates customized Convolutional Neural Network (CNN) layers, including GlobalAveragePooling2D, Dense, and Dropout layers, along with transfer learning to achieve both high accuracy and interpretability in breast cancer diagnosis. The DNBCD model incorporates several preprocessing techniques, including image normalization, resizing, and augmentation, to enhance robustness, and addresses class imbalance using class weights. It employs Grad-CAM (Gradient-weighted Class Activation Mapping) to offer visual justifications for its predictions, increasing trust and transparency among healthcare providers. The model was assessed on two benchmark datasets: Breakhis-400x (B-400x) and the Breast Ultrasound Images Dataset (BUSI), containing 1820 and 1578 images, respectively. We systematically divided the datasets into training (70%), testing (20%), and validation (10%) sets to ensure effective model training and evaluation, obtaining accuracies of 93.97% on the B-400x dataset (benign and malignant classes) and 89.87% on the BUSI dataset (benign, malignant, and normal classes). Experimental results demonstrate that the proposed DNBCD model significantly outperforms existing state-of-the-art approaches and has potential for use in clinical environments. We have also made all materials publicly accessible to the research community at: https://github.com/romzanalom/XAI-Based-Deep-Neural-Breast-Cancer-Detection
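
For readers who want a concrete picture of the architecture the summary describes, the following is a minimal Keras sketch, not the authors' released implementation (their code is linked above). It stacks GlobalAveragePooling2D, Dense, and Dropout layers on a frozen, ImageNet-pretrained DenseNet121 backbone and adds a basic Grad-CAM routine; the hidden width (256), dropout rate (0.5), 224x224 input size, optimizer, and conv-layer name are illustrative assumptions, not values taken from the paper.

# Minimal sketch (not the authors' released code) of a DenseNet121-based classifier
# of the kind the abstract describes, plus a Grad-CAM heatmap for visual explanation.
# Hidden width, dropout rate, input size, and layer names are assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models


def build_dnbcd_like_model(num_classes, input_shape=(224, 224, 3)):
    """Transfer-learning head on an ImageNet-pretrained DenseNet121 backbone."""
    base = tf.keras.applications.DenseNet121(
        include_top=False, weights="imagenet", input_shape=input_shape
    )
    base.trainable = False  # freeze the backbone for the initial training phase
    x = layers.GlobalAveragePooling2D()(base.output)
    x = layers.Dense(256, activation="relu")(x)  # assumed hidden width
    x = layers.Dropout(0.5)(x)                   # assumed dropout rate
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    return models.Model(base.input, outputs, name="dnbcd_like")


def grad_cam(model, image, last_conv_layer_name="relu", class_index=None):
    """Gradient-weighted Class Activation Mapping for one preprocessed image.

    "relu" is the final convolutional activation of Keras' DenseNet121;
    adjust the layer name if the backbone or naming differs.
    """
    grad_model = models.Model(
        model.input,
        [model.get_layer(last_conv_layer_name).output, model.output],
    )
    inputs = tf.cast(image[None, ...], tf.float32)
    with tf.GradientTape() as tape:
        tape.watch(inputs)                        # record the full forward pass
        conv_out, preds = grad_model(inputs)
        if class_index is None:
            class_index = int(tf.argmax(preds[0]))
        class_score = preds[:, class_index]
    grads = tape.gradient(class_score, conv_out)  # d(score)/d(feature maps)
    weights = tf.reduce_mean(grads, axis=(1, 2))  # per-channel importance weights
    cam = tf.reduce_sum(conv_out * weights[:, None, None, :], axis=-1)
    cam = tf.nn.relu(cam)[0]                      # keep only positive evidence
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()


# Example: 2 classes for B-400x (benign/malignant), 3 for BUSI (benign/malignant/normal).
model = build_dnbcd_like_model(num_classes=2)
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# Images are assumed to be resized to 224x224 and preprocessed with
# tf.keras.applications.densenet.preprocess_input before training or Grad-CAM.
dummy = tf.keras.applications.densenet.preprocess_input(
    np.random.randint(0, 255, size=(224, 224, 3)).astype("float32")
)
heatmap = grad_cam(model, dummy)  # 7x7 map to upsample and overlay on the image

The class-imbalance handling mentioned in the summary would typically be supplied via the class_weight argument of model.fit, with the 70/20/10 split produced by any standard splitting utility; the exact weighting scheme used by the authors is not specified here.
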
ISSN: 2045-2322
DOI: 10.1038/s41598-025-97718-5