Contour-aware contrastive learning for 3D knee segmentation from MR images

Bibliographic Details
Published in: Pattern Analysis and Applications (PAA), Vol. 28, No. 3
Main Authors: Dong, Xianda; Zhang, Lei; Zhao, Xing; Zhou, Shoujun; Wang, Yuanquan; Xia, Jun; Zhang, Tao
Format: Journal Article
Language: English
Published: London: Springer London (Springer Nature B.V.), 01.09.2025

Summary: Automatic segmentation of knee MR images plays an important role in the diagnosis and treatment of knee osteoarthritis. Existing deep learning-based methods usually require considerable numbers of annotated samples, and manual labeling of knee MR images is tedious and time-consuming. To address this problem, we propose a novel semi-supervised method, named the Contour-Aware Contrastive Learning Network (CACL-Net), for segmentation of the femoral cartilage, tibial internal cartilage, and tibial external cartilage in knee MR images. CACL-Net takes an encoder-decoder structure similar to the V-Net as its backbone and adopts a novel contrastive-learning auxiliary task, the Progressive Encoding Module, which retains the key high-level semantic information of the image while attenuating irrelevant low-level semantic information. We also introduce a contour-based self-attention module that prompts the network to pay more attention to edge details during decoding, thereby producing more accurate segmentation results. Extensive experiments demonstrate that the proposed CACL-Net outperforms several other semi-supervised methods for knee MR image segmentation and show the potential of CACL-Net for semi-supervised segmentation problems in general. Our code is available at https://github.com/ldcdm/CLASS-Net.
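The abstract's contour-based attention idea, i.e. re-weighting decoder features so that edge regions receive more emphasis, can be illustrated with a minimal NumPy sketch. This is a hypothetical reading of the mechanism, not the paper's implementation: the function `contour_attention`, the use of a gradient-magnitude contour map, and the gating formula are all assumptions made for illustration.

```python
import numpy as np

def contour_attention(features, prob_map):
    """Hypothetical contour-based gating of decoder features.

    features : (C, H, W) array of decoder feature maps
    prob_map : (H, W) coarse foreground probability map

    Approximates the contour as the gradient magnitude of the
    probability map, then amplifies features near edges while
    leaving flat regions unchanged.
    """
    # Central-difference gradients of the probability map.
    gy, gx = np.gradient(prob_map)
    contour = np.sqrt(gx ** 2 + gy ** 2)
    # Normalize to [0, 1] and shift by 1 so the gate is 1.0 in
    # flat regions and up to ~2.0 on the strongest contour.
    gate = 1.0 + contour / (contour.max() + 1e-8)
    return features * gate[None, :, :]

# Toy example: a vertical step edge in the probability map.
prob = np.zeros((8, 8))
prob[:, 4:] = 1.0
feats = np.ones((2, 8, 8))
out = contour_attention(feats, prob)
```

In this sketch, features at the step edge are roughly doubled while features in flat regions pass through unchanged; a learned variant would replace the fixed gate with trainable attention weights.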
ISSN: 1433-7541, 1433-755X
DOI: 10.1007/s10044-025-01494-x