A new multidimensional discriminant representation for robust person re-identification


Bibliographic Details
Published in: Pattern Analysis and Applications (PAA), Vol. 26, No. 3, pp. 1191–1204
Main Authors: Chouchane, Ammar; Bessaoudi, Mohcene; Boutellaa, Elhocine; Ouamane, Abdelmalik
Format: Journal Article
Language: English
Published: London: Springer London (Springer Nature B.V.), 01.08.2023

Summary: Person Re-Identification (PRe-ID), or person retrieval, is a challenging computer vision task that aims to identify a specific person across disjoint cameras distributed over different locations. Designing discriminative feature representations and learning distance metrics are critical issues for improving the performance of a PRe-ID system. To address these problems, this paper proposes a new semi-supervised subspace approach named Multilinear Cross-view Quadratic Discriminant Analysis based on Cholesky decomposition (MXQDA-CD), in which a new multidimensional discriminant representation is designed to increase the discrimination between different persons using third-order tensor data that combines several feature parts. Since the matching process yields heterogeneous scores, resulting from subjects captured by multiple cameras under different conditions, score normalization is applied to map these scores into a common space, which improves the performance of the approach. Experimental results on four challenging person re-identification datasets, namely PRID450S, CUHK01, GRID, and VIPeR, show the high competitiveness of the proposed method.
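The summary notes that score normalization maps heterogeneous matching scores into a common space. As a minimal sketch of that idea, the snippet below applies z-score normalization per camera pair; this is one common choice, and the exact scheme used by MXQDA-CD may differ.

```python
import numpy as np

def zscore_normalize(scores):
    """Map raw matching scores to zero mean and unit variance.

    Hypothetical illustration: scores from different camera pairs have
    different ranges; rescaling each set makes them comparable before
    fusion or ranking.
    """
    scores = np.asarray(scores, dtype=float)
    mu, sigma = scores.mean(), scores.std()
    if sigma == 0:  # all scores identical: nothing to rescale
        return scores - mu
    return (scores - mu) / sigma

# Scores from two cameras on very different scales land in a common space:
cam_a = zscore_normalize([0.2, 0.5, 0.9])
cam_b = zscore_normalize([120.0, 150.0, 300.0])
```

After normalization, both score sets have zero mean and unit variance, so a single threshold or ranking rule can be applied across cameras.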
ISSN: 1433-7541, 1433-755X
DOI: 10.1007/s10044-023-01144-0