Domain diversity based meta learning for continual person re-identification

Bibliographic Details
Published in: Pattern Analysis and Applications (PAA), Vol. 28, No. 3
Main Authors: Liu, Zhaoshuo; Feng, Chaolu; Yu, Kun; Song, Jiangdian; Li, Wei
Format: Journal Article
Language: English
Published: London: Springer London, 01.09.2025 (Springer Nature B.V.)

Summary: Continual person re-identification (CReID) aims to learn a unified model that copes with changing scenarios (e.g., from malls to streets and stations). However, deploying the CReID model directly on unseen scenarios (which cannot be foreseen) outside the continual pipeline leads to a drop in generalization performance due to distribution shifts. In this paper, we design a meta-learning paradigm with seen and unseen domains to adapt the model to distribution shifts, where the unseen domains are produced by our carefully designed Domain Diversity (DD). Considering that knowledge encompassing both seen and unseen domains can counteract forgetting as well as improve generalization, we accumulate the learned and future knowledge, corresponding to seen and unseen domains respectively, through graph attention networks. Subsequently, we integrate the accumulated knowledge into the meta-learning steps to guide the training of the model, ensuring less forgetting and better generalization. Extensive experiments conducted on twelve datasets demonstrate the effectiveness of our method, with superior performance in generalization and anti-forgetting. The code is available at https://github.com/DFLAG-NEU/MetaDD.
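The summary describes a meta-learning loop in which each seen-domain batch is paired with a synthesized unseen-domain batch. Below is a minimal sketch of one such meta-train/meta-test step, not the authors' implementation: it assumes PyTorch (>= 2.0 for torch.func), the dd_augment perturbation is a hypothetical stand-in for the paper's Domain Diversity module, and the graph-attention knowledge accumulation is omitted.

```python
# Sketch of one meta-learning step over a seen batch and a DD-synthesized
# "unseen" batch. All function and parameter names here are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.func import functional_call


def dd_augment(images: torch.Tensor) -> torch.Tensor:
    """Crude stand-in for Domain Diversity: perturb per-image channel
    statistics to mimic an unseen camera/scene distribution."""
    scale = 1.0 + 0.2 * torch.randn(images.size(0), 3, 1, 1, device=images.device)
    shift = 0.1 * torch.randn(images.size(0), 3, 1, 1, device=images.device)
    return images * scale + shift


def meta_step(model: nn.Module, images: torch.Tensor, labels: torch.Tensor,
              inner_lr: float = 0.01) -> torch.Tensor:
    # Meta-train: identity-classification loss on the seen-domain batch.
    loss_seen = F.cross_entropy(model(images), labels)

    # Inner update: one gradient step applied to a functional copy of the weights.
    params = dict(model.named_parameters())
    grads = torch.autograd.grad(loss_seen, params.values(), create_graph=True)
    adapted = {name: p - inner_lr * g
               for (name, p), g in zip(params.items(), grads)}

    # Meta-test: evaluate the adapted weights on the synthesized unseen batch,
    # so the outer update favors weights that survive the distribution shift.
    logits_unseen = functional_call(model, adapted, (dd_augment(images),))
    loss_unseen = F.cross_entropy(logits_unseen, labels)

    # The outer optimizer steps on the combined objective.
    return loss_seen + loss_unseen
```

Used inside an ordinary training loop (loss = meta_step(model, x, y); loss.backward(); optimizer.step()), this yields a MAML-style update; the accumulated seen/unseen knowledge described in the summary would act as an additional guide on both losses.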
ISSN: 1433-7541, 1433-755X
DOI: 10.1007/s10044-025-01502-0