A Novel Framework for Whole-Slide Pathological Image Classification Based on the Cascaded Attention Mechanism

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 25, No. 3, p. 726
Main Authors: Liu, Dehua; Hu, Bin
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 01.02.2025

Summary: This study introduces an innovative deep learning framework to address the limitations of traditional pathological image analysis and the pressing demand for medical resources in tumor diagnosis. With the global rise in cancer cases, manual examination by pathologists is increasingly inadequate: it is time-consuming, constrained by a scarcity of trained professionals, and subject to individual variability, all of which affect diagnostic accuracy and efficiency. Deep learning, particularly in computer vision, offers significant potential to mitigate these challenges, since automated models can process large datasets rapidly and accurately, transforming tumor detection and classification. However, existing methods often rely on a single attention mechanism and therefore fail to fully exploit the complexity of pathological images, especially when extracting critical features from whole-slide images. We developed a framework incorporating a cascaded attention mechanism that enhances the recognition of meaningful patterns while suppressing irrelevant background information. Experiments on the Camelyon16 dataset demonstrate superior classification accuracy, model generalization, and result interpretability compared to state-of-the-art techniques. This advancement promises to enhance diagnostic efficiency, reduce healthcare costs, and improve patient outcomes.
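The abstract does not specify how the cascaded attention mechanism is built, so the following is only an illustrative sketch: a minimal PyTorch-style model in which two attention stages are applied in sequence to patch embeddings from a whole-slide image, with the second stage refining the reweighting produced by the first so that background patches are progressively suppressed. The class names, stage count, and dimensions here are assumptions for illustration, not the authors' implementation.

    # Hypothetical sketch of cascaded attention for whole-slide image (WSI)
    # classification. Architecture details are assumed, not from the paper.
    import torch
    import torch.nn as nn

    class AttentionStage(nn.Module):
        """One attention stage: scores each patch embedding and reweights the bag."""
        def __init__(self, dim: int, hidden: int = 128):
            super().__init__()
            self.score = nn.Sequential(
                nn.Linear(dim, hidden),
                nn.Tanh(),
                nn.Linear(hidden, 1),
            )

        def forward(self, x: torch.Tensor):
            # x: (num_patches, dim) -> attention weights over patches: (num_patches, 1)
            a = torch.softmax(self.score(x), dim=0)
            return a * x, a  # reweighted patch features and the weights themselves

    class CascadedAttentionWSI(nn.Module):
        """Two attention stages in cascade: the second refines the first stage's
        reweighted features, progressively suppressing irrelevant background."""
        def __init__(self, dim: int = 512, num_classes: int = 2):
            super().__init__()
            self.stage1 = AttentionStage(dim)
            self.stage2 = AttentionStage(dim)
            self.head = nn.Linear(dim, num_classes)

        def forward(self, patches: torch.Tensor) -> torch.Tensor:
            x, _ = self.stage1(patches)     # first pass: coarse reweighting
            x, a2 = self.stage2(x)          # second pass: refined attention
            slide_embedding = x.sum(dim=0)  # attention-pooled slide-level vector
            return self.head(slide_embedding)

    # Usage: 1,000 patch embeddings (e.g., from a CNN backbone) -> slide logits.
    model = CascadedAttentionWSI(dim=512, num_classes=2)
    logits = model(torch.randn(1000, 512))
    print(logits.shape)  # torch.Size([2])

In a realistic pipeline the patch embeddings would come from a backbone network applied to tiles extracted from the slide, and the final stage's attention weights could be projected back onto the slide as a heatmap, which is one plausible route to the result interpretability the abstract emphasizes.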
ISSN: 1424-8220
DOI: 10.3390/s25030726