MCDAL: Maximum Classifier Discrepancy for Active Learning
| Published in | IEEE Transactions on Neural Networks and Learning Systems, Vol. 34, No. 11, pp. 8753-8763 |
|---|---|
| Main Authors | , , , |
| Format | Journal Article |
| Language | English |
| Published | Piscataway: IEEE, 01.11.2023 (The Institute of Electrical and Electronics Engineers, Inc.) |
| Subjects | |
| Summary | Recent state-of-the-art active learning methods have mostly leveraged generative adversarial networks (GANs) for sample acquisition; however, GANs are known to suffer from instability and sensitivity to hyperparameters. In contrast to these methods, in this article, we propose a novel active learning framework, Maximum Classifier Discrepancy for Active Learning (MCDAL), that exploits the prediction discrepancies among multiple classifiers. In particular, we utilize two auxiliary classification layers that learn tighter decision boundaries by maximizing the discrepancies among them. Intuitively, discrepancies among the auxiliary classification layers' predictions indicate uncertainty in the prediction. In this regard, we propose a novel method that leverages the classifier discrepancies as the acquisition function for active learning. We also provide an interpretation of our idea in relation to existing GAN-based active learning methods and domain adaptation frameworks. Moreover, we empirically demonstrate the utility of our approach, whose performance exceeds that of state-of-the-art methods on several image classification and semantic segmentation datasets in active learning setups. |
| Bibliography | ObjectType-Article-1; SourceType-Scholarly Journals-1; ObjectType-Feature-2 |
| ISSN | 2162-237X, 2162-2388 |
| DOI | 10.1109/TNNLS.2022.3152786 |
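
The summary above describes the core mechanism only in prose. The PyTorch sketch below illustrates one plausible reading of it: two auxiliary classification heads share a backbone, and the per-sample L1 distance between their softmax predictions is used as the acquisition score. All names here (`TwoHeadClassifier`, `clf_aux1`, `acquisition_scores`) and the choice of L1 discrepancy are illustrative assumptions, not the authors' implementation; in particular, the training objective that maximizes the heads' discrepancy is omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TwoHeadClassifier(nn.Module):
    """Feature extractor with a main classifier and two auxiliary heads.

    Hypothetical layout; the paper's actual architecture is not reproduced here.
    """

    def __init__(self, backbone: nn.Module, feat_dim: int, num_classes: int):
        super().__init__()
        self.backbone = backbone  # any feature extractor mapping inputs to feat_dim
        self.clf_main = nn.Linear(feat_dim, num_classes)
        self.clf_aux1 = nn.Linear(feat_dim, num_classes)  # auxiliary head 1
        self.clf_aux2 = nn.Linear(feat_dim, num_classes)  # auxiliary head 2

    def forward(self, x):
        feats = self.backbone(x)
        return self.clf_main(feats), self.clf_aux1(feats), self.clf_aux2(feats)


def discrepancy(logits_a: torch.Tensor, logits_b: torch.Tensor) -> torch.Tensor:
    """Per-sample L1 distance between the two heads' softmax outputs (an assumed metric)."""
    return (F.softmax(logits_a, dim=1) - F.softmax(logits_b, dim=1)).abs().sum(dim=1)


@torch.no_grad()
def acquisition_scores(model: TwoHeadClassifier, unlabeled_loader) -> torch.Tensor:
    """Score each unlabeled sample by auxiliary-head disagreement.

    Higher disagreement is treated as higher predictive uncertainty, so the
    top-scoring samples are the ones sent for labeling.
    """
    model.eval()
    scores = [discrepancy(*model(x)[1:]) for x in unlabeled_loader]
    return torch.cat(scores)
```

To complete one acquisition round under these assumptions, one would take the indices from `scores.topk(budget)` and move those samples from the unlabeled pool to the labeled set before retraining.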