Online Multi-Object Tracking with United Siamese Network and Candidate-Refreshing Model
Published in | 2021 International Joint Conference on Neural Networks (IJCNN), pp. 1 - 8 |
---|---|
Main Authors | |
Format | Conference Proceeding |
Language | English |
Published | IEEE, 18.07.2021 |
Subjects | |
Summary: | Current mainstream multi-object tracking (MOT) algorithms maintain the identities of targets by data association. However, the accuracy of multi-object tracking is affected by unreliable detection results. Meanwhile, separating the motion module from the appearance module prevents affinity features from being fully exploited and increases computational complexity. To address these issues, we propose a multi-task tracking framework comprising a United Siamese Network (USN) and a Candidate-Refreshing (CR) model. The USN integrates motion affinity and appearance affinity into an end-to-end network. This design enables joint learning of similarities between motion patterns and appearance features and enhances the ability to distinguish similar targets in complex environments. The CR model combines detection candidates with tracking candidates and compensates for unreliable detection results by scoring and regressing the bounding boxes. In addition, we equip our framework with a tracklet confidence function based on spatial-temporal information. This function determines the vitality of unmatched tracklets and further alleviates the influence of detectors. Experiments on the MOT16 and MOT17 benchmark datasets show that our framework achieves outstanding performance among online MOT algorithms. |
---|---|
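The abstract describes the Candidate-Refreshing model as pooling detection candidates with tracking candidates and scoring the bounding boxes to compensate for unreliable detections. A minimal sketch of that pooling step is below, assuming a simple greedy score-based suppression over the combined candidate set; the function names, the scoring rule, and the IoU threshold are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of candidate refreshing: detection candidates and
# tracking candidates are pooled, sorted by score, and de-duplicated with
# greedy IoU suppression. All names and thresholds are assumptions.
from typing import List, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2)


def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def refresh_candidates(detections: List[Tuple[Box, float]],
                       tracks: List[Tuple[Box, float]],
                       iou_thresh: float = 0.5) -> List[Tuple[Box, float]]:
    """Pool (box, score) candidates from the detector and the tracker,
    keeping the highest-scoring non-overlapping ones. A low-scoring
    detection overlapped by a confident tracking candidate is replaced
    by it, which is one way missed or noisy detections get compensated."""
    pool = sorted(detections + tracks, key=lambda c: c[1], reverse=True)
    kept: List[Tuple[Box, float]] = []
    for box, score in pool:
        if all(iou(box, kept_box) < iou_thresh for kept_box, _ in kept):
            kept.append((box, score))
    return kept


if __name__ == "__main__":
    dets = [((0.0, 0.0, 10.0, 10.0), 0.9)]
    trks = [((1.0, 1.0, 11.0, 11.0), 0.8),   # overlaps the detection, suppressed
            ((20.0, 20.0, 30.0, 30.0), 0.7)]  # covers a target the detector missed
    print(refresh_candidates(dets, trks))
```

Here the tracking candidate at (20, 20, 30, 30) survives because no detection overlaps it, illustrating how tracker output can fill gaps left by an unreliable detector.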
ISSN: | 2161-4407 |
DOI: | 10.1109/IJCNN52387.2021.9533479 |