A deep learning approach for polyline and building simplification based on graph autoencoder with flexible constraints

Bibliographic Details
Published in: Cartography and Geographic Information Science, Vol. 51, No. 1, pp. 79–96
Main Authors: Yan, Xiongfeng; Yang, Min
Format: Journal Article
Language: English
Published: Taylor & Francis, 02.01.2024

Summary: Polyline and building simplification remain challenging in cartography. Most existing algorithms are geometry-based and rely on specific rules. In this study, we propose a deep learning approach to simplify polylines and buildings based on a graph autoencoder (GAE). The model receives the coordinates of line vertices as inputs and obtains a simplified representation by reconstructing the original inputs with fewer vertices through pooling, in which graph convolution based on the graph Fourier transform is used for the layer-by-layer feature computation. By adjusting the loss functions, constraints such as area and shape preservation and angle-characteristic enhancement are flexibly configured under a unified learning framework. Our results confirmed the applicability of the GAE approach to the multi-scale simplification of land-cover boundaries and contours by adjusting the number of output nodes. Compared with existing Douglas–Peucker, Fourier transform, and Delaunay triangulation approaches, the GAE approach was superior in achieving morphological abstraction while producing reasonably low position, area, and shape changes. Furthermore, we applied it to simplify buildings and demonstrated the potential for preserving the diversified characteristics of different types of lines.
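The pipeline the abstract describes (vertex coordinates in, spectral graph convolution, pooling down to fewer vertices) can be illustrated with a minimal sketch. This is not the authors' model: it substitutes a fixed low-pass spectral filter for learned convolution weights and uniform subsampling for learned pooling, using only NumPy; the function names and the `cutoff` parameter are assumptions for illustration.

```python
import numpy as np

def path_laplacian(n):
    # Normalized Laplacian of a path graph linking consecutive polyline vertices
    A = np.zeros((n, n))
    idx = np.arange(n - 1)
    A[idx, idx + 1] = A[idx + 1, idx] = 1.0
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
    return np.eye(n) - d_inv_sqrt @ A @ d_inv_sqrt

def spectral_smooth_and_pool(coords, n_out, cutoff=0.5):
    # coords: (n, 2) vertex coordinates; n_out: number of vertices to keep
    n = coords.shape[0]
    lam, U = np.linalg.eigh(path_laplacian(n))  # graph Fourier basis
    x_hat = U.T @ coords                        # forward graph Fourier transform
    x_hat[lam > cutoff] = 0.0                   # low-pass filter (stand-in for learned conv)
    smoothed = U @ x_hat                        # inverse transform back to vertex domain
    keep = np.linspace(0, n - 1, n_out).round().astype(int)  # uniform pooling
    return smoothed[keep]

# Toy zig-zag polyline with 50 vertices, simplified to 10
t = np.linspace(0.0, 1.0, 50)
line = np.stack([t, 0.05 * np.sin(40 * t)], axis=1)
simplified = spectral_smooth_and_pool(line, n_out=10)
print(simplified.shape)  # (10, 2)
```

In the paper's framework the filter and pooling would be learned, and the loss terms (position, area, shape, angle) would steer which geometric characteristics the reconstruction preserves; here the cutoff frequency plays a loosely analogous role.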
ISSN: 1523-0406, 1545-0465
DOI: 10.1080/15230406.2023.2218106