TD-Road: Top-Down Road Network Extraction with Holistic Graph Construction


Bibliographic Details
Published in: Computer Vision – ECCV 2022, pp. 562–577
Main Authors: He, Yang; Garg, Ravi; Chowdhury, Amber Roy
Format: Book Chapter
Language: English
Published: Cham: Springer Nature Switzerland, 06.11.2022
Series: Lecture Notes in Computer Science

Summary: Graph-based approaches to road network extraction have become increasingly popular alongside segmentation-based methods. Representing road networks as graphs explicitly defines their topology and avoids the ambiguity of segmentation masks, such as between a real junction and multiple separate roads at different heights. In contrast to bottom-up graph-based approaches, which rely on orientation information, we propose TD-Road, a novel top-down approach that generates road network graphs with a single holistic model. We decompose road extraction into two subtasks: key point prediction and connectedness prediction. Rather than learning intermediate properties of a graph (e.g., orientations or distances for the next move), we directly use graph structures (i.e., node locations and the connections between them) as training supervision for the network and produce road graphs at inference. Our network integrates a relation inference module with key point prediction to capture connections between neighboring points, and it outputs final road graphs with no post-processing required. Extensive experiments on challenging datasets, including City-Scale and SpaceNet, demonstrate the effectiveness and simplicity of our method, which achieves remarkable results compared with previous state-of-the-art methods.
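
The abstract describes two heads over a shared backbone: a key point head that predicts graph node locations, and a relation inference module that scores connectedness between neighboring points. The following is a minimal PyTorch sketch of how such a design could be wired up, not the authors' implementation; the backbone, layer sizes, the candidate-pair input, and every identifier (TDRoadSketch, relation_mlp, etc.) are illustrative assumptions.

# A minimal sketch (not the authors' released code) of the two-headed design
# described in the abstract. All names and layer sizes are illustrative.
import torch
import torch.nn as nn

class TDRoadSketch(nn.Module):
    def __init__(self, in_channels=3, feat_channels=64):
        super().__init__()
        # Shared backbone; a stand-in for whatever encoder the paper uses.
        self.backbone = nn.Sequential(
            nn.Conv2d(in_channels, feat_channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(feat_channels, feat_channels, 3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Key point head: per-pixel probability of being a graph node.
        self.keypoint_head = nn.Conv2d(feat_channels, 1, 1)
        # Relation head: embeds each location; the connectedness of a
        # candidate edge is scored from its two endpoint embeddings.
        self.embed_head = nn.Conv2d(feat_channels, feat_channels, 1)
        self.relation_mlp = nn.Sequential(
            nn.Linear(2 * feat_channels, feat_channels),
            nn.ReLU(inplace=True),
            nn.Linear(feat_channels, 1),
        )

    def forward(self, image, pairs):
        # image: (B, C, H, W); pairs: (B, P, 2, 2) integer (y, x) endpoints
        # of candidate edges between neighboring key points.
        feat = self.backbone(image)
        heatmap = torch.sigmoid(self.keypoint_head(feat))   # node locations
        emb = self.embed_head(feat)                         # (B, F, H, W)
        b = torch.arange(image.size(0))[:, None, None]      # broadcast batch index
        endpoint = emb[b, :, pairs[..., 0], pairs[..., 1]]  # (B, P, 2, F)
        pair_feat = endpoint.flatten(2)                     # (B, P, 2F)
        connectedness = torch.sigmoid(self.relation_mlp(pair_feat)).squeeze(-1)
        return heatmap, connectedness                       # nodes and edges

# Example: two 64x64 images, ten candidate edges each.
model = TDRoadSketch()
heatmap, conn = model(torch.randn(2, 3, 64, 64),
                      torch.randint(0, 64, (2, 10, 2, 2)))
print(heatmap.shape, conn.shape)  # torch.Size([2, 1, 64, 64]) torch.Size([2, 10])

Consistent with the abstract's claim of direct graph supervision, such a sketch would be trained with losses on the two outputs directly, e.g., binary cross-entropy on the node heatmap against ground-truth key point locations and on the connectedness scores against ground-truth edges, rather than on intermediate quantities like orientations or step distances.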
ISBN: 9783031200762, 3031200764
ISSN: 0302-9743, 1611-3349
DOI: 10.1007/978-3-031-20077-9_33