Towards better exploiting convolutional neural networks for remote sensing scene classification

Bibliographic Details
Published in: Pattern Recognition, Vol. 61, pp. 539–556
Main Authors: Nogueira, Keiller; Penatti, Otávio A.B.; dos Santos, Jefersson A.
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.01.2017

Summary: We present an analysis of three possible strategies for exploiting the power of existing convolutional neural networks (ConvNets or CNNs) in scenarios different from the ones they were trained for: full training, fine tuning, and using ConvNets as feature extractors. In many applications, especially remote sensing, it is not feasible to fully design and train a new ConvNet, as this usually requires a considerable amount of labeled data and demands high computational costs. Therefore, it is important to understand how best to use existing ConvNets. We perform experiments with six popular ConvNets on three remote sensing datasets. We also compare the ConvNets in each strategy with existing descriptors and with state-of-the-art baselines. The results indicate that fine tuning tends to be the best-performing strategy; in fact, using the features from a fine-tuned ConvNet with a linear SVM obtains the best results. We also achieved state-of-the-art results on the three datasets used.

Highlights:
• Analysis of the generalization power of ConvNets for remote sensing datasets.
• Comparative analysis of ConvNets and low-level and mid-level feature descriptors.
• Evaluation and analysis of three strategies to exploit existing ConvNets in different scenarios.
• Evaluation of ConvNets with state-of-the-art baselines.
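The summary's best-performing strategy, taking features from a (fine-tuned) ConvNet and classifying them with a linear SVM, can be sketched as below. This is a minimal illustration using scikit-learn: the 512-dimensional feature vectors are synthetic stand-ins generated here, not outputs of an actual ConvNet; in a real pipeline they would come from the penultimate layer of a fine-tuned network run over the scene images.

```python
import numpy as np
from sklearn.svm import LinearSVC

# Stand-in for ConvNet features: two synthetic "scene classes" of 512-d
# vectors. A class-dependent mean shift makes them linearly separable,
# mimicking features from a well fine-tuned network.
rng = np.random.default_rng(0)
n_per_class, dim = 20, 512
feats = np.vstack([
    rng.normal(0.0, 1.0, (n_per_class, dim)),  # class 0 features
    rng.normal(1.0, 1.0, (n_per_class, dim)),  # class 1 features
])
labels = np.array([0] * n_per_class + [1] * n_per_class)

# Linear SVM on the extracted features, as in the paper's best strategy.
clf = LinearSVC(C=1.0).fit(feats, labels)
train_acc = clf.score(feats, labels)
```

The design point is that once good features exist, a simple linear classifier suffices; swapping the synthetic generator for real per-image feature vectors leaves the classification step unchanged.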
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2016.07.001