Crots: Cross-Domain Teacher–Student Learning for Source-Free Domain Adaptive Semantic Segmentation

Bibliographic Details
Published in: International Journal of Computer Vision, Vol. 132, No. 1, pp. 20-39
Main Authors: Luo, Xin; Chen, Wei; Liang, Zhengfa; Yang, Longqi; Wang, Siwei; Li, Chen
Format: Journal Article
Language: English
Published: New York: Springer US, 01.01.2024

Summary: Source-free domain adaptation (SFDA) aims to transfer source knowledge to the target domain from pre-trained source models without accessing private source data. Existing SFDA methods typically adopt a self-training strategy, employing the pre-trained source model to generate pseudo-labels for unlabeled target data. However, these methods face two strict limitations: (1) the discrepancy between source and target domains produces severely noisy, unreliable pseudo-labels, and overfitting to such noisy pseudo-labeled target data causes drastic performance degradation; (2) because the pseudo-labels are class-imbalanced, the target model is prone to forgetting minority classes. To address these two limitations, this study proposes a CROss-domain Teacher–Student learning framework (CROTS) for source-free domain adaptive semantic segmentation. Specifically, with pseudo-labels provided by the intra-domain teacher model, CROTS incorporates Spatial-Aware Data Mixing to generate diverse samples by randomly mixing patches with respect to their spatial semantic layouts. Meanwhile, during inter-domain teacher–student learning, CROTS applies a Rare-Class Patches Mining strategy to mitigate class imbalance: the inter-domain teacher model helps exploit long-tailed rare classes and promotes their contributions to student learning. Extensive experimental results demonstrate that: (1) CROTS mitigates the overfitting issue and yields stable performance improvements, i.e., +16.0% mIoU and +16.5% mIoU for SFDA on GTA5 → Cityscapes and SYNTHIA → Cityscapes, respectively; (2) CROTS improves performance on long-tailed rare classes, alleviating the class imbalance issue; (3) CROTS achieves superior performance compared with other SFDA competitors; (4) CROTS can be applied under the black-box SFDA setting, where it even outperforms many white-box SFDA methods. Our code will be made publicly available at https://github.com/luoxin13/CROTS.
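
The abstract names three mechanisms: teacher-generated pseudo-labels, Spatial-Aware Data Mixing, and Rare-Class Patches Mining within inter-domain teacher–student learning. The following minimal PyTorch sketch is only an illustration of what such components could look like; every function name, the EMA-style teacher update, and the patch-pasting scheme are assumptions made here for clarity, not the authors' released implementation (see https://github.com/luoxin13/CROTS for the actual code).

    import random
    import torch

    @torch.no_grad()
    def ema_update(teacher, student, momentum=0.999):
        # Hypothetical teacher update: the teacher tracks an exponential
        # moving average of the student's weights, a common teacher-student
        # scheme (assumed here; the abstract does not specify the update rule).
        for t, s in zip(teacher.parameters(), student.parameters()):
            t.mul_(momentum).add_(s, alpha=1.0 - momentum)

    def spatial_aware_mix(img_a, lbl_a, img_b, lbl_b, patch=256):
        # Illustrative layout-preserving mix: paste a random patch of image B
        # and its pseudo-label into image A at the SAME spatial position, so
        # classes keep their typical layout (sky on top, road at the bottom).
        _, h, w = img_a.shape              # img_*: (C, H, W), lbl_*: (H, W)
        y = random.randrange(0, h - patch + 1)
        x = random.randrange(0, w - patch + 1)
        img_mix, lbl_mix = img_a.clone(), lbl_a.clone()
        img_mix[:, y:y + patch, x:x + patch] = img_b[:, y:y + patch, x:x + patch]
        lbl_mix[y:y + patch, x:x + patch] = lbl_b[y:y + patch, x:x + patch]
        return img_mix, lbl_mix

    def has_rare_class(lbl_patch, rare_ids):
        # Illustrative rare-class filter: a candidate patch qualifies for
        # mining if its pseudo-label contains at least one long-tailed class
        # id (rare_ids is a 1-D tensor of such ids).
        return bool(torch.isin(lbl_patch, rare_ids).any())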
ISSN: 0920-5691
EISSN: 1573-1405
DOI: 10.1007/s11263-023-01863-1