Accurate Underwater ATR in Forward-Looking Sonar Imagery Using Deep Convolutional Neural Networks
| Published in | IEEE Access, Vol. 7, pp. 125522-125531 |
|---|---|
| Main Authors | , , |
| Format | Journal Article |
| Language | English |
| Published | Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2019 |
| Summary | Underwater automatic target recognition (ATR) is a challenging task for marine robots due to the complex underwater environment. Existing recognition methods mostly rely on hand-crafted features and classifiers, which make it difficult to achieve ideal recognition accuracy. In this paper, we propose a novel method for accurate multiclass underwater ATR using a forward-looking Echoscope sonar and deep convolutional neural networks (DCNNs). A complete recognition pipeline, from data preprocessing through network training to image recognition, was realized. First, we established a dataset of real, measured Echoscope sonar images. Inspired by the human visual attention mechanism, the suspected target region is extracted during preprocessing via graph-based manifold ranking. Second, an end-to-end DCNN model, named EchoNet, was designed for Echoscope sonar image feature extraction and recognition. Finally, a network training method based on transfer learning was developed to address the shortage of training data, and mini-batch gradient descent was used for network optimization. Experimental results demonstrate that the method runs efficiently and that its recognition accuracy on a nine-class underwater ATR task reaches 97.3%, outperforming traditional feature-based methods. The proposed method is expected to be a potential new technology for the intelligent perception of autonomous underwater vehicles. |
| ISSN | 2169-3536 |
| DOI | 10.1109/ACCESS.2019.2939005 |
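
The summary above outlines a training recipe of transfer learning plus mini-batch gradient descent for a nine-class sonar classifier. As a rough illustration only (this is not the authors' EchoNet code), the sketch below shows what such a fine-tuning step could look like in PyTorch; the ResNet-18 backbone, learning rate, momentum, and data loader are assumptions, not details from the paper.

```python
# Minimal sketch, assuming torchvision's ImageNet-pretrained ResNet-18 as a
# stand-in backbone for EchoNet (the paper's own architecture is not shown here).
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 9  # nine-class underwater ATR task described in the summary

# Transfer learning: initialize from pretrained weights, then replace the
# final classifier layer to match the number of sonar target classes.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

criterion = nn.CrossEntropyLoss()
# Mini-batch gradient descent (SGD); hyperparameters are placeholders.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

def train_one_epoch(loader):
    """One pass over a DataLoader yielding (image, label) mini-batches."""
    model.train()
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

In the method described by the summary, the suspected-target-region extraction via graph-based manifold ranking would run as a preprocessing step before images reach such a training loop.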