SaTransformer: Semantic‐aware transformer for breast cancer classification and segmentation

Bibliographic Details
Published in: IET Image Processing, Vol. 17, No. 13, pp. 3789-3800
Main Authors: Zhang, Jie; Zhang, Zhichao; Liu, Hua; Xu, Shiqiang
Format: Journal Article
Language: English
Published: Wiley, 01.11.2023

Summary: Breast cancer classification and segmentation play an important role in identifying and detecting benign and malignant breast lesions. However, segmentation and classification still face many challenges: 1) the characteristics of cancer itself, such as fuzzy edges, complex backgrounds, and significant variation in size, shape, and intensity distribution, make accurate segmentation and classification difficult; 2) existing methods treat classification and segmentation as two separate tasks and thus ignore the potential relationship between them. To overcome these challenges, this paper proposes a novel Semantic-aware transformer (SaTransformer) for breast cancer classification and segmentation. Specifically, the SaTransformer performs the two tasks simultaneously through one unified framework. Unlike existing well-known methods, the segmentation and classification information interact semantically, reinforcing each other during feature representation learning and improving representational ability while requiring less memory and lower computational complexity. The SaTransformer is validated on two publicly available breast cancer datasets, BUSI and UDIAT. Experimental results and quantitative evaluations (accuracy: 97.97%, precision: 98.20%, DSC: 86.34%) demonstrate that the SaTransformer outperforms other state-of-the-art methods.
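The abstract's core idea, a single framework in which segmentation cues inform classification, can be illustrated with a toy sketch. Note that this is not the authors' SaTransformer architecture; it is a minimal NumPy illustration of the general multi-task pattern (one shared token representation, a per-token segmentation head, and a classification query whose attention is biased by the segmentation logits). All names, dimensions, and the random weights are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy shared encoder output: N patch tokens of dimension D.
N, D, num_classes = 16, 8, 2
tokens = rng.standard_normal((N, D))

# Segmentation head: one foreground logit per patch token -> coarse mask.
w_seg = rng.standard_normal((D, 1))
seg_logits = tokens @ w_seg                # shape (N, 1)
seg_mask = (seg_logits > 0).astype(int)    # binary patch-level mask

# Semantic interaction: a classification query attends over the same
# tokens, with attention biased toward patches the segmentation head
# scores as lesion, so segmentation evidence informs classification.
q = rng.standard_normal((D,))
attn = softmax((tokens @ q) / np.sqrt(D) + seg_logits.ravel())
cls_feature = attn @ tokens                # pooled feature, shape (D,)

# Classification head on the segmentation-aware pooled feature.
w_cls = rng.standard_normal((D, num_classes))
cls_logits = cls_feature @ w_cls           # shape (num_classes,)
print("predicted class:", int(cls_logits.argmax()))
```

The bias term `seg_logits.ravel()` inside the softmax is the interaction point: in a trained model both heads would share gradients through it, which is one simple way the two tasks can reinforce each other as the abstract describes.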
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/ipr2.12897