Pedestrian trajectory prediction with convolutional neural networks

Bibliographic Details
Published in: Pattern Recognition, Vol. 121, p. 108252
Main Authors: Zamboni, Simone; Kefato, Zekarias Tilahun; Girdzijauskas, Sarunas; Norén, Christoffer; Dal Col, Laura
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.01.2022

Summary:
•New convolutional model achieves state-of-the-art results on the ETH and TrajNet datasets.
•Random rotations and Gaussian noise are the best data augmentation techniques.
•Coordinates with the origin at the last observation point better represent the trajectory.

Predicting the future trajectories of pedestrians is a challenging problem with a range of applications, from crowd surveillance to autonomous driving. In the literature, methods for pedestrian trajectory prediction have evolved from physics-based models to data-driven models based on recurrent neural networks. In this work, we propose a new approach to pedestrian trajectory prediction, with the introduction of a novel 2D convolutional model. This new model outperforms recurrent models and achieves state-of-the-art results on the ETH and TrajNet datasets. We also present an effective system for representing pedestrian positions, along with powerful data augmentation techniques, such as the addition of Gaussian noise and the use of random rotations, which can be applied to any model. As an additional exploratory analysis, we present experimental results on the inclusion of occupancy methods to model social information, which empirically show that these methods are ineffective in capturing social interaction.
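The two augmentation techniques and the coordinate representation highlighted in the abstract can be sketched in a few lines. This is an illustrative sketch only, not the authors' implementation: the function names, the noise standard deviation, and the trajectory shape convention (a sequence of 2D points as an (T, 2) array) are assumptions for the example.

```python
import numpy as np

def shift_origin(traj):
    """Re-express a trajectory so the origin is the last observed point
    (the representation the abstract reports as most effective).
    traj: (T, 2) array of x, y positions."""
    return traj - traj[-1]

def augment(traj, noise_std=0.05, rng=None):
    """Apply the two augmentations named in the abstract: a random
    rotation about the origin and additive Gaussian noise.
    noise_std is an illustrative value, not taken from the paper."""
    rng = np.random.default_rng() if rng is None else rng
    theta = rng.uniform(0.0, 2.0 * np.pi)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return traj @ rot.T + rng.normal(0.0, noise_std, traj.shape)
```

Because both transforms act on raw coordinates rather than on model internals, they can be applied to the input of any trajectory model, recurrent or convolutional, which is the sense in which the abstract calls them model-agnostic.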
ISSN: 0031-3203
1873-5142
DOI: 10.1016/j.patcog.2021.108252