Video-Based Person Re-Identification by Simultaneously Learning Intra-Video and Inter-Video Distance Metrics

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 27, No. 11, pp. 5683-5695
Main Authors: Zhu, Xiaoke; Jing, Xiao-Yuan; You, Xinge; Zhang, Xinyu; Zhang, Taiping
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.11.2018

Summary: Video-based person re-identification (re-id) is an important application in practice. Since large variations exist between different pedestrian videos, as well as within each video, it is challenging to conduct re-id between pedestrian videos. In this paper, we propose a simultaneous intra-video and inter-video distance learning (SI²DL) approach for video-based person re-id. Specifically, SI²DL simultaneously learns an intra-video distance metric and an inter-video distance metric from the training videos. The intra-video metric is used to make each video more compact, and the inter-video metric is used to ensure that the distance between truly matching videos is smaller than that between wrongly matched videos. Considering that the goal of distance learning is to make truly matching video pairs from different persons well separated from each other, we also propose a pair-separation-based SI²DL (P-SI²DL). P-SI²DL aims to learn a pair of distance metrics under which any two truly matching video pairs can be well separated. Experiments on four public pedestrian image sequence data sets show that our approaches achieve state-of-the-art performance.
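To make the intra-video/inter-video distinction concrete, here is a minimal NumPy sketch of the underlying idea: measuring how compact a video's per-frame features are under one Mahalanobis metric, and how far apart two videos' representations are under another. This is an illustration only, not the authors' SI²DL algorithm; the feature dimensions, synthetic data, and identity matrices standing in for the learned metrics are all assumptions.

```python
import numpy as np

def mahalanobis(x, y, M):
    """Distance under a PSD metric M: sqrt((x - y)^T M (x - y))."""
    d = x - y
    return float(np.sqrt(d @ M @ d))

rng = np.random.default_rng(0)
dim = 4  # hypothetical feature dimension

# Two hypothetical videos, each a stack of per-frame feature vectors
# (different persons, so their feature distributions are shifted apart).
video_a = rng.normal(loc=0.0, size=(5, dim))
video_b = rng.normal(loc=3.0, size=(5, dim))

# Identity matrices stand in for the learned intra/inter metrics.
M_intra = np.eye(dim)
M_inter = np.eye(dim)

# Intra-video compactness: average frame-to-mean distance within a video.
mean_a = video_a.mean(axis=0)
intra_a = np.mean([mahalanobis(f, mean_a, M_intra) for f in video_a])

# Inter-video distance: between the two videos' mean representations.
mean_b = video_b.mean(axis=0)
inter_ab = mahalanobis(mean_a, mean_b, M_inter)

print(intra_a, inter_ab)
```

In the paper's setting, the two metrics would be learned jointly from training videos so that intra-video distances shrink (each video becomes compact) while inter-video distances separate matching pairs from non-matching ones; the identity metrics above merely show where those learned matrices would plug in.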
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2018.2861366