An Active Deep Learning Approach for Minimally Supervised PolSAR Image Classification

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 57, No. 11, pp. 9378-9395
Main Authors: Bi, Haixia; Xu, Feng; Wei, Zhiqiang; Xue, Yong; Xu, Zongben
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.11.2019

More Information
Summary: Deep neural networks have recently received intense interest in polarimetric synthetic aperture radar (PolSAR) image classification. However, their success depends on the availability of large amounts of annotated data, which requires great effort from experienced human annotators. Aiming to improve classification performance at greatly reduced annotation cost, this paper presents an active deep learning approach for minimally supervised PolSAR image classification, which integrates active learning and a fine-tuned convolutional neural network (CNN) into a principled framework. Starting from a CNN trained on a very limited number of labeled pixels, we iteratively and actively select the most informative candidates for annotation, and incrementally fine-tune the CNN by incorporating the newly annotated pixels. Moreover, to boost the performance and robustness of the proposed method, we employ a Markov random field (MRF) to enforce class label smoothness, and a data augmentation technique to enlarge the training set. Extensive experiments on four real benchmark PolSAR images demonstrate that our approach achieves state-of-the-art classification results with significantly reduced annotation cost.
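The summary describes an iterative query-and-retrain loop. The sketch below illustrates one plausible realization in PyTorch; the entropy-based informativeness criterion, the oracle annotation callback, and all hyperparameters are assumptions for illustration, not the authors' exact method, and the MRF label-smoothing and data-augmentation steps are omitted for brevity.

import torch
import torch.nn.functional as F

def select_informative(model, pool_x, k):
    """Pick the k pool samples the current CNN is least certain about
    (highest predictive entropy) -- a common active-learning criterion,
    assumed here for illustration."""
    model.eval()
    with torch.no_grad():
        probs = F.softmax(model(pool_x), dim=1)
        entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)
    return torch.topk(entropy, min(k, len(pool_x))).indices

def active_deep_learning(model, labeled_x, labeled_y, pool_x, oracle,
                         rounds=10, k=50, epochs=5, lr=1e-3):
    """Iteratively query an annotation oracle (hypothetical callback that
    returns class labels) and incrementally fine-tune the CNN."""
    for _ in range(rounds):
        # 1. Actively select the most informative unlabeled pixels.
        idx = select_informative(model, pool_x, k)
        new_x = pool_x[idx]
        new_y = oracle(new_x)  # stands in for the human annotation step
        # 2. Incorporate the newly annotated pixels into the training set
        #    and remove them from the unlabeled pool.
        labeled_x = torch.cat([labeled_x, new_x])
        labeled_y = torch.cat([labeled_y, new_y])
        keep = torch.ones(len(pool_x), dtype=torch.bool)
        keep[idx] = False
        pool_x = pool_x[keep]
        # 3. Incrementally fine-tune the CNN on the enlarged labeled set.
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        model.train()
        for _ in range(epochs):
            opt.zero_grad()
            loss = F.cross_entropy(model(labeled_x), labeled_y)
            loss.backward()
            opt.step()
    return model

In a PolSAR setting, pool_x would presumably hold per-pixel polarimetric features (e.g., patches drawn from the coherency matrix), and the MRF smoothing described in the summary would then be applied to the final per-pixel label map as a post-processing pass.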
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2019.2926434