Enhancing cross-domain detection: adaptive class-aware contrastive transformer

Bibliographic Details
Published in: arXiv.org
Main Authors: Zeng, Ziru; Ding, Yue; Lu, Hongtao
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 24.01.2024
Summary: Recently, the detection transformer has gained substantial attention for its inherently minimal post-processing requirement. However, this paradigm relies on abundant training data, and in the cross-domain adaptation setting, insufficient labels in the target domain exacerbate class imbalance and degrade model performance. To address these challenges, we propose a novel class-aware cross-domain detection transformer based on adversarial learning and the mean-teacher framework. First, considering the inconsistency between the classification and regression tasks, we introduce an IoU-aware prediction branch and exploit the consistency of classification and localization scores to filter and reweight pseudo labels. Second, we devise a dynamic category threshold refinement to adaptively manage model confidence. Third, to alleviate class imbalance, an instance-level class-aware contrastive learning module is presented to encourage discriminative features for each class, particularly benefiting minority classes. Experimental results across diverse domain-adaptive scenarios validate our method's effectiveness in improving performance and alleviating class imbalance, outperforming state-of-the-art transformer-based methods.
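As a rough illustration of the first two components described in the summary (pseudo-label filtering/reweighting from classification-localization consistency, and dynamic per-class thresholds), the sketch below is a minimal, assumed implementation, not the authors' code: the scores from a hypothetical IoU-aware branch are combined with classification scores into a quality score, and the per-class thresholds are updated with an assumed EMA rule. All function and variable names are illustrative.

```python
import torch


def filter_and_reweight_pseudo_labels(cls_scores, iou_scores, labels,
                                      class_thresholds, momentum=0.9):
    """cls_scores, iou_scores: (N,) per-instance scores in [0, 1];
    labels: (N,) predicted class indices; class_thresholds: (C,) tensor."""
    # Consistency-aware quality score: geometric mean of the classification
    # score and the predicted IoU from the (assumed) IoU-aware branch.
    quality = torch.sqrt(cls_scores * iou_scores)

    # Per-instance threshold looked up from the current per-class thresholds.
    thr = class_thresholds[labels]
    keep = quality >= thr

    # Kept pseudo labels are reweighted by their quality score.
    weights = quality[keep]

    # Dynamic category threshold refinement: move each class threshold toward
    # the mean quality of its kept instances (an assumed EMA update).
    new_thresholds = class_thresholds.clone()
    kept_labels = labels[keep]
    for c in kept_labels.unique():
        mean_q = quality[keep][kept_labels == c].mean()
        new_thresholds[c] = momentum * class_thresholds[c] + (1 - momentum) * mean_q

    return keep, weights, new_thresholds
```

In this sketch, filtering and reweighting are driven by the same consistency score, so detections whose classification and localization heads disagree are either discarded or down-weighted; the exact scoring and update rules used in the paper may differ.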
ISSN:2331-8422