Deep Learning-based DSM Generation from Dual-Aspect SAR Data

Bibliographic Details
Published in: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. X-2-2024, pp. 193-200
Main Authors: Recla, Michael; Schmitt, Michael
Format: Journal Article
Language: English
Published: Göttingen: Copernicus GmbH / Copernicus Publications, 10.06.2024
Summary: Rapid mapping demands efficient methods for fast extraction of information from satellite data while minimizing data requirements. This paper explores the potential of deep learning for the generation of high-resolution urban elevation data from Synthetic Aperture Radar (SAR) imagery. To mitigate occlusion effects caused by the side-looking nature of SAR remote sensing, two SAR images from opposing aspects are leveraged and processed in an end-to-end deep neural network. The presented approach is the first of its kind to implicitly handle the transition from the SAR-specific slant range geometry to a ground-based mapping geometry within the model architecture. Comparative experiments demonstrate the superiority of the dual-aspect fusion over single-image methods in terms of reconstruction quality and geolocation accuracy. Notably, the model exhibits robust performance across diverse acquisition modes and geometries, showcasing its generalizability and suitability for height mapping applications. The study's findings underscore the potential of deep learning-driven SAR techniques in generating high-quality urban surface models efficiently and economically.
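The summary's central idea can be illustrated with a toy sketch. This is not the paper's method (which performs the fusion implicitly inside an end-to-end deep network); it only demonstrates why two opposing aspects help: a pixel occluded by radar layover/shadow in one side-looking view can still be observed from the opposing view. The function name and the None-as-occlusion convention are illustrative assumptions.

```python
# Toy illustration of dual-aspect height fusion (NOT the paper's network):
# each input is a per-pixel height map from one viewing aspect, with
# occluded pixels marked as None. Where both aspects observe the ground,
# the estimates are averaged; where only one does, that one is kept.

def fuse_dual_aspect(height_asc, height_desc):
    """Fuse per-pixel heights from two opposing SAR aspects (hypothetical helper)."""
    fused = []
    for row_a, row_d in zip(height_asc, height_desc):
        fused_row = []
        for h_a, h_d in zip(row_a, row_d):
            observed = [h for h in (h_a, h_d) if h is not None]
            # average available observations; None if occluded in both views
            fused_row.append(sum(observed) / len(observed) if observed else None)
        fused.append(fused_row)
    return fused

# Toy 1x3 scene: middle pixel occluded in the ascending view,
# right pixel occluded in the descending view.
asc = [[10.0, None, 8.0]]
desc = [[10.5, 6.0, None]]
print(fuse_dual_aspect(asc, desc))  # [[10.25, 6.0, 8.0]]
```

In the actual model, this complementarity is learned jointly with the slant-range-to-ground transition rather than applied as an explicit post-hoc averaging.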
ISSN: 2194-9042, 2194-9050
DOI: 10.5194/isprs-annals-X-2-2024-193-2024