Image Motion Blur Removal Algorithm Based on Generative Adversarial Network
Published in: Programming and Computer Software, Vol. 50, No. 5, pp. 403–415
Format: Journal Article
Language: English
Published: Moscow, Pleiades Publishing, 01.10.2024 (Springer Nature B.V.)
Summary: The restoration of blurred images is a crucial topic in the field of machine vision, with far-reaching implications for enhancing information acquisition quality, improving algorithmic accuracy, and enriching image texture. Efforts to mitigate the phenomenon of blur have progressed from statistical approaches to those utilizing deep learning techniques. In this paper, we propose a Generative Adversarial Network (GAN)-based image restoration method to address the limitations of existing techniques in restoring color and detail in motion-blurred images. To reduce the computational complexity of the generative adversarial network and mitigate vanishing gradients during training, a U-Net-based generator is used, and it is configured to emphasize the channel and spatial characteristics of the original information through a proposed CSAR (Channel and Spatial Attention Residual) block module rather than a simple concatenation operation. To validate the efficacy of the algorithm, comprehensive comparative experiments were conducted on the GoPro dataset. Experimental results show that the peak signal-to-noise ratio is improved compared to the SRN and MPRNet algorithms, with good image restoration ability. Object detection experiments using YOLOv3 showed that the proposed algorithm can generate deblurred images with higher information quality.
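The abstract does not spell out the internal layout of the CSAR block, so the following is a minimal, hypothetical PyTorch sketch assuming a CBAM-style design: a small convolutional branch whose output is reweighted by channel attention and then spatial attention before the residual addition. The class names, the reduction ratio of 16, and the 7×7 spatial-attention kernel are illustrative assumptions, not details from the paper.

```python
# Hypothetical CSAR (Channel and Spatial Attention Residual) block sketch.
# All layer sizes and orderings below are assumptions, not the paper's design.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (assumed design)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(self.pool(x))  # per-channel reweighting


class SpatialAttention(nn.Module):
    """Spatial attention from pooled channel statistics (assumed design)."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)   # per-pixel channel mean
        mx, _ = x.max(dim=1, keepdim=True)  # per-pixel channel max
        mask = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * mask                     # per-location reweighting


class CSARBlock(nn.Module):
    """Residual block applying channel then spatial attention to a conv branch."""
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x):
        return x + self.sa(self.ca(self.body(x)))  # residual connection


# Example: refine a U-Net skip tensor instead of concatenating it directly.
skip = torch.randn(1, 64, 128, 128)
print(CSARBlock(64)(skip).shape)  # torch.Size([1, 64, 128, 128])
```

In a U-Net generator, a block like this could refine an encoder feature map before it is merged into the decoder path, which is one plausible reading of the abstract's "rather than a simple concatenation operation."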
ISSN: 0361-7688; 1608-3261
DOI: 10.1134/S0361768824700208
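For reference, the metric cited in the comparison with SRN and MPRNet is the standard peak signal-to-noise ratio; a minimal NumPy implementation of that definition (not code from the paper) is:

```python
# Standard PSNR between a ground-truth sharp image and a restored image.
import numpy as np


def psnr(reference: np.ndarray, restored: np.ndarray, peak: float = 255.0) -> float:
    """PSNR in dB for two same-shape images; higher means closer to reference."""
    mse = np.mean((reference.astype(np.float64) - restored.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)
```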