Mamba-convolution hybrid network for underwater image enhancement
Published in | Scientific Reports, Vol. 15, No. 1, p. 31975 |
---|---|
Main Authors | |
Format | Journal Article |
Language | English |
Published | London: Nature Publishing Group UK, 30.08.2025 |
Summary | Underwater imagery frequently exhibits low clarity and significant color distortion as a result of the inherent conditions of the marine environment and variations in illumination. Such degradation in image quality fundamentally undermines the efficacy of marine ecological monitoring and the detection of underwater targets. To address this issue, we present a Mamba-Convolution network for Underwater Image Enhancement (MC-UIE). Concretely, we first use a standard convolution layer with a 3 × 3 kernel to obtain initial image feature maps. Then, we develop an iterable Mamba-Convolution Hybrid Block (M-C HB) to enhance the global and local dependencies of the feature maps. The core of the M-C HB is the 2D Selective Scan (SS2D) and the Feature Attention Module (FAM), which together learn the global and local dependencies of images more efficiently. After that, a Cross Fusion Mamba Block (CFMB) is designed to fuse feature maps from different levels. Finally, extensive qualitative and quantitative experiments on mainstream datasets demonstrate that the proposed method significantly outperforms existing methods in color, illumination, and detail restoration. Our code and results are available at: https://github.com/WYJGR/MC-Net/ (a minimal architectural sketch of this pipeline is given below the record). |
ISSN | 2045-2322 |
DOI | 10.1038/s41598-025-15404-y |
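
The abstract walks through a concrete pipeline: a 3 × 3 convolutional stem, stacked Mamba-Convolution Hybrid Blocks pairing a global SS2D branch with a local Feature Attention Module branch, and a Cross Fusion Mamba Block that fuses feature maps from different levels. The PyTorch sketch below is only a minimal reading of that description, not the authors' implementation (which lives at https://github.com/WYJGR/MC-Net/): the module internals, channel dimensions, residual wiring, and the depthwise-convolution stand-in used in place of a true 2D selective scan are all assumptions.

```python
# Hedged sketch of the MC-UIE pipeline as described in the abstract.
# Every internal design choice here is an assumption; a real SS2D would
# flatten the feature map along several scan directions and run a
# selective state-space model (Mamba) over each sequence.

import torch
import torch.nn as nn


class FAM(nn.Module):
    """Feature Attention Module (assumed form): local mixing + channel attention."""
    def __init__(self, dim):
        super().__init__()
        self.local = nn.Conv2d(dim, dim, 3, padding=1, groups=dim)  # depthwise local mixing
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),   # global channel statistics
            nn.Conv2d(dim, dim, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        y = self.local(x)
        return y * self.attn(y)       # reweight channels of the local features


class SS2DStub(nn.Module):
    """Placeholder for the 2D Selective Scan: a large-kernel depthwise conv
    stands in for the long-range mixing a real selective scan would provide."""
    def __init__(self, dim):
        super().__init__()
        self.mix = nn.Conv2d(dim, dim, 7, padding=3, groups=dim)
        self.proj = nn.Conv2d(dim, dim, 1)

    def forward(self, x):
        return self.proj(self.mix(x))


class MCHB(nn.Module):
    """Mamba-Convolution Hybrid Block: global (SS2D) + local (FAM) branches,
    combined residually and stacked/iterated in the network."""
    def __init__(self, dim):
        super().__init__()
        self.norm = nn.GroupNorm(1, dim)
        self.global_branch = SS2DStub(dim)
        self.local_branch = FAM(dim)

    def forward(self, x):
        y = self.norm(x)
        return x + self.global_branch(y) + self.local_branch(y)


class CFMB(nn.Module):
    """Cross Fusion Mamba Block (assumed): fuse two feature levels by
    concatenation, 1x1 projection, and SS2D-style mixing."""
    def __init__(self, dim):
        super().__init__()
        self.fuse = nn.Conv2d(2 * dim, dim, 1)
        self.mix = SS2DStub(dim)

    def forward(self, shallow, deep):
        return self.mix(self.fuse(torch.cat([shallow, deep], dim=1)))


class MCUIE(nn.Module):
    def __init__(self, dim=32, n_blocks=4):
        super().__init__()
        self.stem = nn.Conv2d(3, dim, 3, padding=1)   # 3x3 conv: initial feature maps
        self.blocks = nn.ModuleList([MCHB(dim) for _ in range(n_blocks)])
        self.cfmb = CFMB(dim)
        self.head = nn.Conv2d(dim, 3, 3, padding=1)

    def forward(self, img):
        x = self.stem(img)
        shallow = x
        for blk in self.blocks:
            x = blk(x)
        x = self.cfmb(shallow, x)                     # fuse shallow and deep levels
        return torch.clamp(img + self.head(x), 0.0, 1.0)  # residual enhancement


# Smoke test on a dummy underwater image tensor in [0, 1]
out = MCUIE()(torch.rand(1, 3, 64, 64))
print(out.shape)  # torch.Size([1, 3, 64, 64])
```

The residual output head (input image plus a predicted correction, clamped to the valid range) is a common pattern in enhancement networks and is likewise an assumption here, not something the abstract specifies.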