Super‐resolution reconstruction algorithm for medical images by fusion of wavelet transform and multi‐scale adaptive feature selection

Bibliographic Details
Published in: IET Image Processing
Main Authors: Wang, QiaoSu; Ma, Qiaomei
Format: Journal Article
Language: English
Published: 01.10.2024

More Information
Summary: Conventional computed tomography (CT) images often suffer from blurred edges and unclear details. Image super‐resolution methods can significantly enhance CT image quality, thereby improving diagnostic accuracy. To better extract detailed features and enhance the cascading effects of different feature levels, we propose a novel medical image super‐resolution algorithm that integrates the discrete wavelet transform and multi‐scale adaptive feature selection. Our approach uses both the low‐resolution image and its high‐frequency component from the frequency domain as network inputs, with the high‐frequency component providing learning supervision, which enhances detail fidelity in reconstruction. Additionally, we introduce a multi‐scale adaptive feature selection module to learn from different layers of CT images and their inter‐layer correlations. Finally, pixel information is efficiently integrated by a coordinate attention mechanism that incorporates the concept of squeeze‐and‐excitation. Experimental results show that our method outperforms state‐of‐the‐art methods, achieving superior reconstruction at scale factors of 2, 4, and 8, especially at scale factor 8, where it surpasses others by 1.12 dB in PSNR, 0.0145 in SSIM, and 0.0038 in LPIPS. Visually, our method also delivers more accurate details and better perceptual quality.
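To illustrate the wavelet‐based input described in the summary, the sketch below performs a single‐level 2D Haar discrete wavelet transform in NumPy, splitting an image into one low‐frequency band and three high‐frequency bands (the edge and detail information the paper uses as an auxiliary network input and supervision signal). This is a minimal, hypothetical sketch of the general technique, not the authors' implementation; the function name and shapes are illustrative.

```python
import numpy as np

def haar_dwt2(img):
    """Single-level 2D Haar discrete wavelet transform.

    Splits an image (height and width must be even) into a
    low-frequency band (LL) and three high-frequency bands
    (LH, HL, HH) that carry edge and fine-detail information.
    """
    # Pairwise averages (low-pass) and differences (high-pass) along rows
    lo_r = (img[:, 0::2] + img[:, 1::2]) / 2.0
    hi_r = (img[:, 0::2] - img[:, 1::2]) / 2.0
    # Repeat along columns to get the four sub-bands
    ll = (lo_r[0::2, :] + lo_r[1::2, :]) / 2.0
    lh = (lo_r[0::2, :] - lo_r[1::2, :]) / 2.0
    hl = (hi_r[0::2, :] + hi_r[1::2, :]) / 2.0
    hh = (hi_r[0::2, :] - hi_r[1::2, :]) / 2.0
    return ll, (lh, hl, hh)

# Illustrative use: the stacked high-frequency bands would be fed to the
# super-resolution network alongside the low-resolution image and also
# serve as an extra supervision target for detail fidelity.
lr_image = np.random.rand(64, 64).astype(np.float32)
ll, (lh, hl, hh) = haar_dwt2(lr_image)
high_freq = np.stack([lh, hl, hh])  # shape (3, 32, 32)
```

In practice a library such as PyWavelets (`pywt.dwt2`) would typically replace the hand‐rolled transform; the manual version is shown only to make the low/high‐frequency split explicit.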
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/ipr2.13252