Iterative shrinkage thresholding-based anti-multi-noise compression perceptual image reconstruction network

Bibliographic Details
Published in: Signal, Image and Video Processing, Vol. 18, no. 5, pp. 4569–4578
Main Authors: Xiang, Jianhong; Liang, Qiming; Xu, Hao; Wang, Linyu; Liu, Yang
Format: Journal Article
Language: English
Published: London: Springer London (Springer Nature B.V.), 01.07.2024
ISSN: 1863-1703, 1863-1711
DOI: 10.1007/s11760-024-03095-3

Summary: Telemedicine imaging services usually require the wireless transmission of large numbers of medical images (MRI, CT, etc.), which are subject to noise interference during transmission and blocking artifacts during compression; both degrade reconstruction quality and thus affect diagnostic accuracy. This paper proposes an iterative shrinkage thresholding algorithm (ISTA)-based anti-noise compressive sensing image reconstruction network to address this problem. The network adopts a constrained sparse model that builds both orthogonality and binary constraints on the sampling matrix into the network; it comprises a feature extraction subnetwork, a parameter initialization subnetwork, and a reconstruction anti-noise subnetwork, incorporates a channel attention mechanism, and introduces a hybrid anti-noise deblocking network within the reconstruction anti-noise subnetwork. Experiments show that the network achieves maximum peak signal-to-noise ratios in the range of 29.70–39.31 dB under different noise interference scenarios.
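The record itself contains no code. As background to the summary, the classic ISTA iteration that the proposed network unrolls alternates a gradient step on the data-fidelity term with soft-thresholding (the proximal operator of the l1 norm). A minimal sketch of plain ISTA on a synthetic sparse-recovery problem (not the authors' network; the matrix sizes, step size, and regularization weight below are illustrative assumptions):

```python
import numpy as np

def soft_threshold(v, tau):
    # Element-wise soft-thresholding: prox of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, y, lam=0.01, n_iter=500):
    """Plain ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1.

    Uses step size 1/L, where L = ||A||_2^2 is the Lipschitz
    constant of the gradient of the smooth data term.
    """
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)            # gradient step on 0.5*||Ax - y||^2
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy demo: recover a 5-sparse signal from noisy compressed measurements.
rng = np.random.default_rng(0)
n, m = 100, 40                              # signal length, measurement count
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = rng.standard_normal(5)
A = rng.standard_normal((m, n)) / np.sqrt(m)  # random sampling matrix
y = A @ x_true + 0.01 * rng.standard_normal(m)
x_hat = ista(A, y, lam=0.01, n_iter=500)
```

A learned variant such as the one summarized above replaces the fixed step size and threshold with trainable per-iteration parameters and the sampling matrix with a constrained, learned one.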