Oil Spill Identification based on Dual Attention UNet Model Using Synthetic Aperture Radar Images
Published in: Journal of the Indian Society of Remote Sensing, Vol. 51, No. 1, pp. 121-133
Main Authors:
Format: Journal Article
Language: English
Published: New Delhi: Springer India, 01.01.2023 (Springer Nature B.V.)
Summary: Oil spills cause tremendous damage to marine and coastal environments and ecosystems. Previous deep learning-based studies have addressed oil spill detection as a semantic segmentation problem. However, further improvement is still required to address the noisy nature of Synthetic Aperture Radar (SAR) imagery, which limits segmentation performance. In this study, a new deep learning model based on the Dual Attention Model (DAM) is developed to automatically detect oil spills in a water body. We enhanced a conventional UNet segmentation network by integrating a dual attention model (DAM) to selectively highlight the relevant and discriminative global and local characteristics of oil spills in SAR imagery. The DAM is composed of a Channel Attention Map and a Position Attention Map, which are stacked in the decoder network of the UNet. The proposed DAM-UNet is compared with four baselines, namely a fully convolutional network, PSPNet, LinkNet, and the traditional UNet, and empirically outperforms all four. Moreover, the experiments use the EG-Oil Spill dataset, a large set of SAR images comprising 3000 image pairs. The overall accuracy of the proposed method reaches 94.2%, an increase of 3.2% over the traditional UNet. The study opens new development ideas for integrating attention modules into other deep learning tasks, including machine translation, image-based analysis, action recognition, and speech recognition.
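The summary describes the dual attention module only at a high level (a Position Attention Map and a Channel Attention Map applied to decoder features). The sketch below is an illustrative, assumed PyTorch implementation in the style of a standard DANet-type dual attention block, not the authors' released code; all class names, the fusion-by-summation choice, and the 64-channel example are hypothetical assumptions.

```python
# Illustrative sketch only: a DANet-style dual attention block that could sit on a
# UNet decoder feature map. Not the paper's code; shapes and names are assumptions.
import torch
import torch.nn as nn


class PositionAttention(nn.Module):
    """Position (spatial) attention: each pixel aggregates features from all pixels."""

    def __init__(self, channels: int):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (B, HW, C/8)
        k = self.key(x).flatten(2)                      # (B, C/8, HW)
        attn = torch.softmax(q @ k, dim=-1)             # (B, HW, HW) pixel-to-pixel weights
        v = self.value(x).flatten(2)                    # (B, C, HW)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x


class ChannelAttention(nn.Module):
    """Channel attention: each channel map is re-weighted by its similarity to all channels."""

    def __init__(self):
        super().__init__()
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        feat = x.flatten(2)                                         # (B, C, HW)
        attn = torch.softmax(feat @ feat.transpose(1, 2), dim=-1)   # (B, C, C)
        out = (attn @ feat).view(b, c, h, w)
        return self.gamma * out + x


class DualAttentionBlock(nn.Module):
    """Position and channel attention over one decoder stage, fused here by summation
    (one common choice; the paper may combine or stack the two maps differently)."""

    def __init__(self, channels: int):
        super().__init__()
        self.pam = PositionAttention(channels)
        self.cam = ChannelAttention()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.pam(x) + self.cam(x)


if __name__ == "__main__":
    # Example: attend over a hypothetical 64-channel, 32x32 decoder feature map.
    feats = torch.randn(1, 64, 32, 32)
    print(DualAttentionBlock(64)(feats).shape)  # torch.Size([1, 64, 32, 32])
```

As a design note, both attention maps add only a residual correction (weighted by a zero-initialized gamma), so the block can be dropped into an existing UNet decoder without disturbing its initial behaviour.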
ISSN: 0255-660X, 0974-3006
DOI: 10.1007/s12524-022-01624-6