Countering Noisy Labels By Learning From Auxiliary Clean Labels


Bibliographic Details
Published in: arXiv.org
Main Authors: Tsai, Tsung Wei; Li, Chongxuan; Zhu, Jun
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 12.09.2019

Summary: We consider the learning from noisy labels (NL) problem, which emerges in many real-world applications. In addition to the widely studied synthetic noise in the NL literature, we also consider the pseudo labels in semi-supervised learning (Semi-SL) as a special case of NL. For both types of noise, we argue that the generalization performance of existing methods is highly coupled with the quality of the noisy labels. We therefore counter the problem from a novel and unified perspective: learning from auxiliary clean labels. Specifically, we propose the Rotational-Decoupling Consistency Regularization (RDCR) framework, which integrates consistency-based methods with the self-supervised rotation task to learn noise-tolerant representations. Experiments show that RDCR achieves performance comparable or superior to state-of-the-art methods under small noise, and significantly outperforms existing methods under large noise.
ISSN:2331-8422
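The summary mentions a self-supervised rotation task: the labels for this auxiliary task are generated from the data itself (which quarter-turn was applied), so they are always clean regardless of label noise. A minimal sketch of how such rotation labels can be constructed is shown below; the helper name and NumPy-based setup are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def make_rotation_batch(images):
    """Build the self-supervised rotation task from a batch of images.

    Each image is rotated by 0, 90, 180, and 270 degrees, and the
    rotation index (0-3) serves as an auxiliary clean label that a
    model can be trained to predict. Hypothetical helper; the RDCR
    paper's exact pipeline may differ.
    """
    rotated, labels = [], []
    for img in images:
        for k in range(4):           # k quarter-turns counterclockwise
            rotated.append(np.rot90(img, k))
            labels.append(k)
    return np.stack(rotated), np.array(labels)
```

Because the rotation labels are derived deterministically from the inputs, they remain noise-free even when the classification labels are corrupted, which is the motivation for using them as an auxiliary training signal.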