MPFCNet: multi-scale parallel feature fusion convolutional network for 3D knee segmentation from MR images

Bibliographic Details
Published in: Pattern Analysis and Applications (PAA), Vol. 28, No. 2
Main Authors: Zhang, Hanzheng; Wu, Qing; Zhao, Xing; Wang, Yuanquan; Zhou, Shoujun; Zhang, Lei; Zhang, Tao
Format: Journal Article
Language: English
Published: London: Springer London, 01.06.2025
Publisher: Springer Nature B.V.

Summary: Accurate and automatic segmentation of knee magnetic resonance (MR) images plays a vital role in the diagnosis of osteoarthritis and knee bone diseases. However, the anatomical structure of the knee joint is complex, making it difficult to segment knee joints accurately and efficiently. This paper proposes a knee joint segmentation model for MR images, named the multi-scale parallel feature fusion convolutional network (MPFCNet). A Large Kernel Attention (LKA) module is introduced in MPFCNet, which effectively enlarges the receptive field while preserving detailed textures, yielding better feature extraction. To further exploit complementary information across scales in both the spatial and channel dimensions, a Multi-Scale Fusion (MSF) module is designed. A Hybrid Feedforward Attention (HFA) module is proposed to capture long-range dependencies. Experiments and comparisons with state-of-the-art methods were conducted on the publicly available OAI-ZIB dataset. The results show that MPFCNet achieves excellent results on the knee joint segmentation task, improving the average Dice similarity coefficient.
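The evaluation metric named in the summary, the Dice similarity coefficient (DSC), measures overlap between a predicted segmentation mask and the ground-truth mask. A minimal sketch of its computation on flattened binary masks (function name, smoothing term, and example masks are illustrative assumptions, not from the paper):

```python
def dice_coefficient(pred, target, smooth=1e-6):
    """Dice similarity coefficient between two binary masks.

    DSC = 2|A ∩ B| / (|A| + |B|); `smooth` (an assumed convention)
    avoids division by zero when both masks are empty.
    """
    # Element-wise intersection of the two binary masks
    intersection = sum(p * t for p, t in zip(pred, target))
    return (2.0 * intersection + smooth) / (sum(pred) + sum(target) + smooth)

# Hypothetical example: two 8-voxel masks agreeing on 3 of 4 foreground voxels
pred   = [1, 1, 1, 0, 0, 0, 1, 0]
target = [1, 1, 1, 1, 0, 0, 0, 0]
print(round(dice_coefficient(pred, target), 3))  # → 0.75
```

In practice the coefficient is averaged over the segmented structures (e.g. femur, tibia, and their cartilages in knee MR segmentation) to give the average DSC the summary refers to.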
Bibliography:ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
ISSN:1433-7541
1433-755X
DOI:10.1007/s10044-025-01437-6