Siamese reciprocal classification and residual regression for robust object tracking
Published in: Digital signal processing, Vol. 123, p. 103451
Main Authors:
Format: Journal Article
Language: English
Published: Elsevier Inc, 30.04.2022
Summary: Recently, Siamese trackers have received increasing attention in visual tracking due to their satisfactory balance between performance and efficiency. However, most Siamese trackers neglect the misalignment between classification confidence and regression accuracy, which limits their peak performance. In this paper, we propose a novel reciprocal localization-aware classification and residual regression architecture built on a Siamese network, named SiamRCRR. The proposed SiamRCRR consists of three subnetworks: a residual regression subnetwork that fully exploits object-localization accuracy by refining the bounding box, a localization-aware classification subnetwork that predicts comprehensive scores combining classification confidence and regression accuracy, and a Siamese subnetwork that extracts features for classification and regression. The interaction of the residual regression and localization-aware classification subnetworks forms a closed-loop structure that exploits the mutual benefit between the classification and regression tasks, so consistency between classification confidence and regression accuracy is maintained during tracking. Experiments on five challenging benchmarks, GOT-10k, OTB-100, LaSOT, VOT2019, and NFS, show that SiamRCRR significantly outperforms well-performing counterparts in both tracking accuracy and running efficiency.
ISSN: 1051-2004, 1095-4333
DOI: 10.1016/j.dsp.2022.103451
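
The abstract describes a three-subnetwork, closed-loop layout: Siamese feature extraction, residual refinement of coarse boxes, and a classification head conditioned on the refined boxes. The following is a minimal PyTorch sketch of that layout for orientation only; the backbone, head sizes, the depth-wise cross-correlation fusion, and all names (SiamRCRRSketch, xcorr, coarse_reg, residual_reg) are assumptions, not the paper's actual implementation.

```python
# Minimal sketch of the three-subnetwork layout described in the abstract.
# Every module name, layer size, and the cross-correlation fusion step are
# illustrative assumptions; the paper's actual design is not reproduced here.
import torch
import torch.nn as nn
import torch.nn.functional as F

def xcorr(z, x):
    # Depth-wise cross-correlation between template (z) and search (x)
    # features, a common Siamese-tracking fusion step (assumed here).
    b, c, h, w = x.shape
    out = F.conv2d(x.reshape(1, b * c, h, w),
                   z.reshape(b * c, 1, *z.shape[2:]), groups=b * c)
    return out.reshape(b, c, *out.shape[2:])

class SiamRCRRSketch(nn.Module):
    def __init__(self, feat_ch=256):
        super().__init__()
        # Siamese subnetwork: shared backbone for template and search patches.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, feat_ch, 7, stride=4), nn.ReLU(),
            nn.Conv2d(feat_ch, feat_ch, 3, stride=2), nn.ReLU(),
        )
        # Residual regression subnetwork: a coarse box head plus a residual
        # head that refines the coarse estimate (refinement of the box).
        self.coarse_reg = nn.Conv2d(feat_ch, 4, 3, padding=1)
        self.residual_reg = nn.Conv2d(feat_ch + 4, 4, 3, padding=1)
        # Localization-aware classification subnetwork: the score is
        # conditioned on the refined boxes, so confidence tracks accuracy.
        self.cls = nn.Conv2d(feat_ch + 4, 1, 3, padding=1)

    def forward(self, template, search):
        zf, xf = self.backbone(template), self.backbone(search)
        fused = xcorr(zf, xf)
        coarse = self.coarse_reg(fused)                    # initial boxes
        refined = coarse + self.residual_reg(torch.cat([fused, coarse], 1))
        score = self.cls(torch.cat([fused, refined], 1))   # loc-aware score
        return torch.sigmoid(score), refined

# Example with hypothetical patch sizes: 127x127 template, 255x255 search.
tracker = SiamRCRRSketch()
scores, boxes = tracker(torch.randn(1, 3, 127, 127),
                        torch.randn(1, 3, 255, 255))
```

Feeding the refined boxes back into the classification head is what makes the score "localization-aware"; the paper describes this coupling as a closed loop between the two heads, which the simple concatenation above only approximates.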