Ground estimation and point cloud segmentation using SpatioTemporal Conditional Random Field
| Published in | 2017 IEEE Intelligent Vehicles Symposium (IV), pp. 1105–1110 |
|---|---|
| Main Authors | , , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.06.2017 |
Summary: Whether it is to feed an object detection-and-tracking system or to generate proper occupancy grids, ground extraction from 3D point clouds and data classification are critical processing tasks, on whose efficiency the whole perception chain can drastically depend. Flat-ground assumptions or shape recognition in point clouds can lead either to systematic errors or to massive computations. This paper describes an adaptive method for ground labeling in 3D point clouds, based on a local ground elevation estimation. The system models the ground as a Spatio-Temporal Conditional Random Field (STCRF). Spatial and temporal dependencies within the segmentation process are unified by a dynamic probabilistic framework based on the conditional random field (CRF). Ground elevation parameters are estimated in parallel in each node, using an interconnected Expectation Maximization (EM) algorithm variant. The approach, designed to meet high-speed vehicle constraints, performs efficiently with both highly dense (Velodyne-64) and sparser (Ibeo-Lux) 3D point clouds; it has been implemented and deployed on experimental vehicles and platforms, and is currently being tested on embedded systems (Nvidia Jetson TX1, TK1). Experiments on real road data in various situations (city, countryside, mountain roads, ...) show promising results.
DOI: 10.1109/IVS.2017.7995861
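
To make the summarized approach concrete, below is a minimal, illustrative Python/NumPy sketch of the core idea: a 2D grid of local ground-elevation estimates, refined by EM-style reassignment of points and blended toward their spatial neighbors as a crude stand-in for the CRF's pairwise smoothness terms. This is not the authors' implementation: the grid layout, the threshold `tol`, the smoothing weight `alpha`, and all function names are assumptions made here for illustration, and the temporal coupling of the STCRF is omitted entirely.

```python
# Illustrative sketch (not the paper's code): grid-based local ground
# elevation estimation with EM-style updates and spatial smoothing.
# `cell_size`, `tol`, and `alpha` are made-up parameters for this example.
import numpy as np

def segment_ground(points, cell_size=1.0, n_iters=5, alpha=0.5, tol=0.2):
    """points: (N, 3) array of x, y, z. Returns a boolean ground mask."""
    # Assign each point to a 2D grid cell.
    ij = np.floor(points[:, :2] / cell_size).astype(int)
    ij -= ij.min(axis=0)                      # shift indices to start at 0
    rows, cols = ij.max(axis=0) + 1

    # Initialize each cell's ground elevation with its lowest point.
    elev = np.full((rows, cols), np.nan)
    for (i, j), z in zip(map(tuple, ij), points[:, 2]):
        if np.isnan(elev[i, j]) or z < elev[i, j]:
            elev[i, j] = z

    for _ in range(n_iters):
        # E-step-like assignment: points near the cell elevation are ground.
        ground = np.abs(points[:, 2] - elev[ij[:, 0], ij[:, 1]]) < tol

        # M-step-like update: re-estimate elevation from assigned points.
        new_elev = elev.copy()
        for (i, j) in set(map(tuple, ij)):
            in_cell = (ij[:, 0] == i) & (ij[:, 1] == j)
            zs = points[in_cell & ground, 2]
            if zs.size:
                new_elev[i, j] = zs.mean()

        # Spatial coupling: blend each cell toward the mean of its
        # 4-neighbors, a crude stand-in for CRF pairwise smoothness.
        padded = np.pad(new_elev, 1, constant_values=np.nan)
        nbrs = np.nanmean(np.stack([padded[:-2, 1:-1], padded[2:, 1:-1],
                                    padded[1:-1, :-2], padded[1:-1, 2:]]),
                          axis=0)
        blend = np.where(np.isnan(nbrs), new_elev,
                         (1 - alpha) * new_elev + alpha * nbrs)
        elev = np.where(np.isnan(blend), elev, blend)

    return np.abs(points[:, 2] - elev[ij[:, 0], ij[:, 1]]) < tol
```

Per the abstract, the actual STCRF additionally carries these per-node estimates across time through temporal links and runs the per-node EM updates in parallel, which this sequential, spatial-only sketch does not attempt to reproduce.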