Towards Grand Unification of Object Tracking

Bibliographic Details
Published in: Computer Vision - ECCV 2022, Vol. 13681, pp. 733-751
Main Authors: Yan, Bin; Jiang, Yi; Sun, Peize; Wang, Dong; Yuan, Zehuan; Luo, Ping; Lu, Huchuan
Format: Book Chapter
Language: English
Published: Springer Nature Switzerland, 01.01.2022
Series: Lecture Notes in Computer Science
ISBN: 9783031198021; 3031198026
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-031-19803-8_43

Summary: We present a unified method, termed Unicorn, that can simultaneously solve four tracking problems (SOT, MOT, VOS, MOTS) with a single network using the same model parameters. Because the definitions of the object tracking problem itself are fragmented, most existing trackers are developed to address a single task or a subset of tasks and over-specialize on the characteristics of specific tasks. By contrast, Unicorn provides a unified solution, adopting the same input, backbone, embedding, and head across all tracking tasks. For the first time, we unify the tracking network architecture and learning paradigm across these tasks. Unicorn performs on par with or better than its task-specific counterparts on 8 tracking benchmarks, including LaSOT, TrackingNet, MOT17, BDD100K, DAVIS16-17, MOTS20, and BDD100K MOTS. We believe that Unicorn will serve as a solid step towards the general vision model. Code is available at https://github.com/MasterBin-IIAU/Unicorn.
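To make the unified design concrete, the PyTorch sketch below illustrates what a single network with a shared input format, backbone, embedding, and heads for all four tasks could look like. All module names, dimensions, and the fusion step are illustrative assumptions, not Unicorn's actual implementation; see the GitHub repository above for the real code.

# Hypothetical sketch of a task-agnostic tracking network in the spirit of
# Unicorn: one backbone, one embedding, and shared heads used by
# SOT/MOT/VOS/MOTS alike. Names and shapes are illustrative only.
import torch
import torch.nn as nn

class UnifiedTracker(nn.Module):
    def __init__(self, embed_dim: int = 128):
        super().__init__()
        # Shared backbone: extracts features from reference and current frames.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=7, stride=4, padding=3),
            nn.ReLU(),
            nn.Conv2d(64, embed_dim, kernel_size=3, stride=2, padding=1),
        )
        # Shared embedding: projects features into a common matching space.
        self.embed = nn.Conv2d(embed_dim, embed_dim, kernel_size=1)
        # Shared heads: boxes serve SOT/MOT, masks serve VOS/MOTS.
        self.box_head = nn.Conv2d(embed_dim, 4, kernel_size=1)
        self.mask_head = nn.Conv2d(embed_dim, 1, kernel_size=1)

    def forward(self, reference: torch.Tensor, current: torch.Tensor):
        # Same input format for every task: a reference frame and the
        # current frame, each of shape (B, 3, H, W).
        f_ref = self.embed(self.backbone(reference))
        f_cur = self.embed(self.backbone(current))
        # Pixel-wise correspondence between the two frames via
        # dot-product similarity over the channel dimension.
        corr = torch.einsum("bchw,bcxy->bhwxy", f_ref, f_cur)
        # Crude fusion of the correspondence signal into the current-frame
        # features (averaged over reference positions); a real system would
        # use a learned fusion module here.
        fused = f_cur * corr.flatten(1, 2).mean(dim=1, keepdim=True)
        return self.box_head(fused), self.mask_head(fused)

if __name__ == "__main__":
    model = UnifiedTracker()
    ref = torch.randn(1, 3, 256, 256)
    cur = torch.randn(1, 3, 256, 256)
    boxes, masks = model(ref, cur)
    print(boxes.shape, masks.shape)  # same outputs regardless of task

The point of the sketch is the parameter sharing: every task flows through the identical backbone, embedding, and heads, so one set of weights can, in principle, serve box-level and mask-level tracking at once.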
Bibliography: Supplementary Information: The online version contains supplementary material available at https://doi.org/10.1007/978-3-031-19803-8_43.
B. Yan: This work was performed while Bin Yan was an intern at ByteDance.