SAD-Net: a full spectral self-attention detail enhancement network for single image dehazing
Published in | Scientific Reports, Vol. 15, No. 1, Article 11875 (13 pages) |
---|---|
Main Authors | |
Format | Journal Article |
Language | English |
Published | London: Nature Publishing Group UK (Nature Portfolio), 07.04.2025 |
Summary | Single-image dehazing plays a significant role in video surveillance and intelligent transportation. However, existing dehazing methods built on vanilla convolution extract features only in the spatial domain and lack the ability to capture multi-directional information. To address these issues, we design a new full spectral attention-based detail enhancement dehazing network, named SAD-Net. SAD-Net adopts a U-Net-like structure and integrates Spectral Detail Enhancement Convolution (SDEC) and Frequency-Guided Attention (FGA). SDEC combines the wavelet transform with difference convolution (DC) to enhance high-frequency features while preserving low-frequency information. FGA detects haze-induced discrepancies and fine-tunes feature modulation. Experimental results show that SAD-Net outperforms six other dehazing networks on the Dense-Haze, NH-Haze, RESIDE, and I-Haze datasets. Specifically, it raises the peak signal-to-noise ratio (PSNR) to 17.16 dB on the Dense-Haze dataset, surpassing current state-of-the-art (SOTA) methods. In addition, SAD-Net achieves excellent dehazing performance on an external dataset without any prior training. |
ISSN | 2045-2322 |
DOI | 10.1038/s41598-025-92061-1 |
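
The abstract describes SDEC only at a high level: a wavelet transform combined with difference convolution so that high-frequency detail is enhanced while low-frequency content is preserved. The PyTorch sketch below illustrates that general idea, not the authors' implementation. The Haar wavelet, the central-difference-convolution variant and its theta value, the shared DC weights across sub-bands, the residual connection, and all names (`SDECSketch`, `CentralDifferenceConv`, `haar_dwt`, `haar_idwt`) are assumptions of ours, since the record gives no such details.

```python
# Illustrative sketch of an SDEC-style block (assumed design, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


def haar_dwt(x):
    """Single-level 2-D Haar decomposition into LL, LH, HL, HH sub-bands."""
    a = x[:, :, 0::2, 0::2]
    b = x[:, :, 0::2, 1::2]
    c = x[:, :, 1::2, 0::2]
    d = x[:, :, 1::2, 1::2]
    ll = (a + b + c + d) / 2
    lh = (-a - b + c + d) / 2
    hl = (-a + b - c + d) / 2
    hh = (a - b - c + d) / 2
    return ll, lh, hl, hh


def haar_idwt(ll, lh, hl, hh):
    """Exact inverse of haar_dwt."""
    a = (ll - lh - hl + hh) / 2
    b = (ll - lh + hl - hh) / 2
    c = (ll + lh - hl - hh) / 2
    d = (ll + lh + hl + hh) / 2
    out = torch.zeros(ll.shape[0], ll.shape[1], ll.shape[2] * 2, ll.shape[3] * 2,
                      device=ll.device, dtype=ll.dtype)
    out[:, :, 0::2, 0::2] = a
    out[:, :, 0::2, 1::2] = b
    out[:, :, 1::2, 0::2] = c
    out[:, :, 1::2, 1::2] = d
    return out


class CentralDifferenceConv(nn.Module):
    """Central difference convolution: vanilla conv minus a theta-weighted
    term built from the kernel's spatial sum (a common CDC formulation)."""

    def __init__(self, channels, theta=0.7):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.theta = theta

    def forward(self, x):
        out = self.conv(x)
        # 1x1 convolution with the spatially summed 3x3 kernel gives the
        # "difference" term subtracted from the vanilla response.
        kernel_sum = self.conv.weight.sum(dim=(2, 3), keepdim=True)
        out_center = F.conv2d(x, kernel_sum)
        return out - self.theta * out_center


class SDECSketch(nn.Module):
    """Hypothetical SDEC-style block: sharpen the high-frequency Haar
    sub-bands with difference convolution, keep the low-frequency band
    untouched, then reconstruct and add a residual connection."""

    def __init__(self, channels):
        super().__init__()
        self.dc = CentralDifferenceConv(channels)  # shared across sub-bands (assumption)

    def forward(self, x):
        ll, lh, hl, hh = haar_dwt(x)
        lh, hl, hh = self.dc(lh), self.dc(hl), self.dc(hh)
        return haar_idwt(ll, lh, hl, hh) + x


if __name__ == "__main__":
    block = SDECSketch(channels=32)
    feat = torch.randn(1, 32, 64, 64)
    print(block(feat).shape)  # torch.Size([1, 32, 64, 64])
```

Because the inverse Haar transform reproduces its input exactly when the sub-bands are left alone, only the difference-convolution branch on the LH/HL/HH bands alters the features here, which is one simple way to realise "enhance high-frequency detail while preserving low-frequency information".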