SEGMENTATION OF CANCER MASSES ON BREAST ULTRASOUND IMAGES USING MODIFIED U-NET

Bibliographic Details
Published in: Informatyka, automatyka, pomiary w gospodarce i ochronie środowiska, Vol. 13, No. 3, pp. 11-15
Main Authors: Khallassi, Ihssane; El Yousfi Alaoui, My Hachem; Jilbab, Abdelilah
Format: Journal Article
Language: English
Published: Lublin University of Technology, 30.09.2023

Summary: Breast cancer is responsible for a large number of deaths among women every year, and accurate localization of a breast lesion is a crucial stage of diagnosis. Segmentation of breast ultrasound images contributes to improving the detection of breast anomalies. This paper presents an automatic approach to segmenting breast ultrasound images; the proposed model, a modified U-Net called Attention Residual U-Net, is designed to help radiologists delineate the boundaries of breast tumors during clinical examination. Attention Residual U-Net combines existing components: the convolutional U-Net architecture, the attention gate mechanism, and residual neural network blocks. A public dataset of breast ultrasound images from Baheya Hospital in Egypt is used in this work. The Dice coefficient, Jaccard index, and accuracy are used to evaluate the performance of the proposed model on the test set; Attention Residual U-Net achieves a Dice coefficient of 90%, a Jaccard index of 76%, and an accuracy of 90%. The proposed model is compared with two other breast segmentation methods on the same dataset, and the results show that the modified U-Net achieves accurate segmentation of breast lesions in breast ultrasound images.
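
The abstract names the architectural ingredients (U-Net, attention gates, residual blocks) but the record contains no implementation details. As a minimal sketch of how these pieces are commonly combined, the PyTorch code below shows a residual convolutional block and an additive attention gate of the kind typically applied to U-Net skip connections; all class names, channel parameters, and design choices are illustrative assumptions, not the authors' code.

# Minimal sketch (not the authors' implementation): a residual conv block
# and an additive attention gate as typically used on U-Net skip paths.
# Class names and channel sizes are illustrative assumptions.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with batch norm, plus a 1x1 shortcut so the
    block learns a residual on top of its input."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
        )
        self.shortcut = nn.Conv2d(in_ch, out_ch, 1)  # match channel count
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.body(x) + self.shortcut(x))

class AttentionGate(nn.Module):
    """Additive attention gate: the decoder's gating signal g re-weights
    the encoder skip features x before they are concatenated."""
    def __init__(self, g_ch, x_ch, inter_ch):
        super().__init__()
        self.wg = nn.Conv2d(g_ch, inter_ch, 1)
        self.wx = nn.Conv2d(x_ch, inter_ch, 1)
        self.psi = nn.Sequential(nn.Conv2d(inter_ch, 1, 1), nn.Sigmoid())
        self.relu = nn.ReLU(inplace=True)

    def forward(self, g, x):
        # g and x are assumed to share the same spatial size here.
        alpha = self.psi(self.relu(self.wg(g) + self.wx(x)))  # (N,1,H,W) in [0,1]
        return x * alpha  # suppress irrelevant skip activations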
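
The reported metrics follow standard definitions: Dice = 2|A∩B| / (|A| + |B|) and Jaccard = |A∩B| / |A∪B| for a predicted mask A and ground-truth mask B. A small sketch of computing them on binary masks is given below; the smoothing constant eps is an assumption to avoid division by zero, since the record does not specify one.

# Standard Dice and Jaccard (IoU) for binary masks; `eps` is an assumed
# smoothing constant (the record does not specify one).
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice = 2|A ∩ B| / (|A| + |B|) on {0,1} arrays."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

def jaccard_index(pred, target, eps=1e-7):
    """Jaccard (IoU) = |A ∩ B| / |A ∪ B| on {0,1} arrays."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return (inter + eps) / (union + eps)

Note that Jaccard is never larger than Dice for the same pair of masks, which is consistent with the 90% Dice and 76% Jaccard figures reported above.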
ISSN: 2083-0157, 2391-6761
DOI: 10.35784/iapgos.5319