Simultaneous extraction of roads and buildings in remote sensing imagery with convolutional neural networks

Bibliographic Details
Published in: ISPRS Journal of Photogrammetry and Remote Sensing, Vol. 130, pp. 139-149
Main Authors: Alshehhi, Rasha; Marpu, Prashanth Reddy; Woon, Wei Lee; Dalla Mura, Mauro
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.08.2017
ISSN: 0924-2716, 1872-8235
DOI: 10.1016/j.isprsjprs.2017.05.002

Summary: Extraction of man-made objects (e.g., roads and buildings) from remotely sensed imagery plays an important role in many urban applications (e.g., urban land use and land cover assessment, updating geographical databases, and change detection). This task is normally difficult due to complex data in the form of heterogeneous appearance with large intra-class and low inter-class variations. In this work, we propose a single patch-based Convolutional Neural Network (CNN) architecture for extraction of roads and buildings from high-resolution remote sensing data. Low-level features of roads and buildings (e.g., asymmetry and compactness) of adjacent regions are integrated with CNN features during the post-processing stage to improve the performance. Experiments are conducted on two challenging datasets of high-resolution images to demonstrate the performance of the proposed network architecture, and the results are compared with other patch-based network architectures. The results demonstrate the validity and superior performance of the proposed network architecture for extracting roads and buildings in urban areas.
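The summary describes a patch-based CNN that labels image patches as road, building, or background. As a rough illustration only (the paper's actual layer configuration, patch size, and post-processing with asymmetry/compactness features are not reproduced here), the sketch below shows a minimal patch-based classifier in PyTorch that assigns one of three hypothetical classes to each input patch; all layer widths and hyperparameters are assumptions.

```python
# Minimal patch-based CNN sketch (illustrative; not the authors' architecture).
import torch
import torch.nn as nn


class PatchCNN(nn.Module):
    def __init__(self, in_channels: int = 3, num_classes: int = 3, patch_size: int = 32):
        super().__init__()
        # Two conv blocks that halve the spatial resolution twice (32 -> 16 -> 8).
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
        )
        flat = 64 * (patch_size // 4) ** 2
        # Fully connected head producing logits for background / road / building.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(flat, 128),
            nn.ReLU(inplace=True),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


if __name__ == "__main__":
    model = PatchCNN()
    patches = torch.randn(8, 3, 32, 32)  # a batch of 32x32 RGB patches
    logits = model(patches)
    print(logits.shape)                  # torch.Size([8, 3])
```

In a patch-based setup like this, the image is tiled into small patches, each patch is classified independently, and the per-patch predictions are reassembled into a map; the paper additionally fuses region-level shape cues with the CNN features in a post-processing step, which is not shown in this sketch.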