Online Sequence Clustering Algorithm for Video Trajectory Analysis

Bibliographic Details
Main Authors: Yuemaier, Aximu; Chen, Xiaogang; Qian, Xingyu; Liang, Longfei; Li, Shunfeng; Song, Zhitang
Format: Journal Article
Language: English
Published: 15.05.2023

Summary: Target tracking and trajectory modeling have important applications in surveillance video analysis and have received considerable attention in the fields of road safety and community security. In this work, we propose a lightweight real-time video analysis scheme that uses a model learned from motion patterns to monitor the behavior of objects, which can be used for applications such as real-time representation and prediction. The proposed sequence clustering algorithm, which operates on discrete sequences, gives the system continuous online learning capability. The intrinsic repeatability of object trajectories is exploited to automatically construct the behavioral model through three processes: feature extraction, cluster learning, and model application. Beyond the discretization of trajectory features and straightforward model applications, this paper focuses on the online clustering algorithm and its incremental learning process. Finally, the feasibility of the algorithm is verified by learning trajectory models from actual surveillance video, and the characteristics and performance of the clustering algorithm are analyzed and discussed. The scheme supports real-time online learning and processing of motion models while avoiding heavy arithmetic computation, making it well suited to front-end intelligent perception scenarios.
DOI: 10.48550/arxiv.2305.08418
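
As a rough illustration of the kind of online, incremental clustering over discretized trajectories described in the summary, the Python sketch below groups discrete symbol sequences as they arrive, assigning each new trajectory to the most similar existing cluster or starting a new one. The sequence representation, the normalized longest-common-subsequence similarity, the threshold-based cluster creation rule, and all names (OnlineSequenceClusterer, observe, etc.) are illustrative assumptions, not the authors' algorithm.

from dataclasses import dataclass
from typing import List


def lcs_length(a: List[int], b: List[int]) -> int:
    """Length of the longest common subsequence of two discrete sequences."""
    dp = [0] * (len(b) + 1)
    for x in a:
        prev = 0
        for j, y in enumerate(b, start=1):
            cur = dp[j]
            dp[j] = prev + 1 if x == y else max(dp[j], dp[j - 1])
            prev = cur
    return dp[len(b)]


def similarity(a: List[int], b: List[int]) -> float:
    """Normalized LCS similarity in [0, 1]."""
    if not a or not b:
        return 0.0
    return lcs_length(a, b) / max(len(a), len(b))


@dataclass
class Cluster:
    prototype: List[int]  # representative discrete trajectory
    count: int = 1        # number of trajectories absorbed so far


class OnlineSequenceClusterer:
    """Incrementally groups discretized trajectories as they arrive."""

    def __init__(self, threshold: float = 0.7):
        self.threshold = threshold
        self.clusters: List[Cluster] = []

    def observe(self, seq: List[int]) -> int:
        """Assign one trajectory online; returns the index of its cluster."""
        best_idx, best_sim = -1, 0.0
        for i, c in enumerate(self.clusters):
            s = similarity(seq, c.prototype)
            if s > best_sim:
                best_idx, best_sim = i, s
        if best_sim >= self.threshold:
            # Incremental update: keep the longer sequence as the prototype
            # (a crude stand-in for a proper prototype merge step).
            c = self.clusters[best_idx]
            if len(seq) > len(c.prototype):
                c.prototype = list(seq)
            c.count += 1
            return best_idx
        self.clusters.append(Cluster(prototype=list(seq)))
        return len(self.clusters) - 1


if __name__ == "__main__":
    # Trajectories already discretized into grid-cell indices (hypothetical data).
    clusterer = OnlineSequenceClusterer(threshold=0.6)
    for traj in ([1, 2, 3, 4, 5], [1, 2, 3, 5], [9, 8, 7, 6], [1, 2, 4, 5]):
        idx = clusterer.observe(traj)
        print(f"trajectory {traj} -> cluster {idx}")

With a similarity threshold of 0.6, the first, second, and fourth example trajectories fall into one cluster while the dissimilar third trajectory starts a new one; the actual system described in the paper would additionally handle trajectory feature extraction from video and the downstream model application steps.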