Robust Contrastive Active Learning with Feature-guided Query Strategies
Main Authors | , , , , , |
---|---|
Format | Journal Article |
Language | English |
Published | 13.09.2021 |
Subjects | |
Summary: | We introduce supervised contrastive active learning (SCAL) and propose efficient query strategies in active learning based on feature similarity (featuresim) and principal component analysis based feature-reconstruction error (fre) to select informative data samples with diverse feature representations. We demonstrate that our proposed method achieves state-of-the-art accuracy and model calibration, and reduces sampling bias, in an active learning setup for balanced and imbalanced datasets on image classification tasks. We also evaluate the robustness of the model to distributional shift derived from different query strategies in the active learning setting. Through extensive experiments, we show that our proposed approach outperforms high-performing compute-intensive methods by a large margin, resulting in 9.9% lower mean corruption error, 7.2% lower expected calibration error under dataset shift, and 8.9% higher AUROC for out-of-distribution detection. |
---|---|
DOI: | 10.48550/arxiv.2109.06873 |
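The abstract describes two feature-guided query strategies: scoring unlabeled samples by feature similarity (featuresim) and by a PCA-based feature-reconstruction error (fre). The sketch below is a minimal, hypothetical illustration of how such scoring could be implemented on pooled feature vectors from the trained encoder; the function names, the cosine-similarity-to-centroid formulation, and the top-k selection rule are assumptions for illustration, not the paper's exact method.

```python
# Hypothetical sketch of featuresim- and fre-style acquisition scoring.
# Assumes `labeled_feats` / `unlabeled_feats` are (n_samples, n_features)
# feature matrices extracted from the encoder; details are illustrative only.
import numpy as np
from sklearn.decomposition import PCA


def fre_scores(labeled_feats: np.ndarray, unlabeled_feats: np.ndarray,
               n_components: int = 32) -> np.ndarray:
    """Score unlabeled samples by how poorly a PCA subspace fit on labeled
    features reconstructs them (higher error = more novel feature content)."""
    pca = PCA(n_components=n_components).fit(labeled_feats)
    reconstructed = pca.inverse_transform(pca.transform(unlabeled_feats))
    return np.linalg.norm(unlabeled_feats - reconstructed, axis=1)


def featuresim_scores(labeled_feats: np.ndarray,
                      unlabeled_feats: np.ndarray) -> np.ndarray:
    """Score unlabeled samples by negative cosine similarity to the mean
    labeled feature, so the most dissimilar (most diverse) samples rank first."""
    centroid = labeled_feats.mean(axis=0)
    centroid /= np.linalg.norm(centroid) + 1e-12
    normed = unlabeled_feats / (
        np.linalg.norm(unlabeled_feats, axis=1, keepdims=True) + 1e-12)
    return -(normed @ centroid)


def select_queries(scores: np.ndarray, budget: int) -> np.ndarray:
    """Return indices of the top-`budget` unlabeled samples to query for labels."""
    return np.argsort(scores)[::-1][:budget]
```

In an active learning loop, one of the scoring functions would be applied to the current unlabeled pool each round and the selected indices sent for labeling before retraining the model.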