Robust fine-grained image classification with noisy labels
| Field | Value |
|---|---|
| Published in | The Visual Computer, Vol. 39, No. 11, pp. 5637-5650 |
| Main Authors | , , |
| Format | Journal Article |
| Language | English |
| Published | Berlin/Heidelberg: Springer Berlin Heidelberg, 01.11.2023 (Springer Nature B.V.) |
Summary: Since annotating fine-grained labels requires special expertise, label annotations are often of low quality in many real-world fine-grained image classification (FGIC) tasks. Owing to the presence of noisy labels, directly training deep fine-grained models tends to yield inferior recognition ability. To address this problem in FGIC, a robust classification approach combining the “active–passive–loss (APL)” framework with multi-branch attention learning is proposed. First, to learn discriminative regions for classification effectively, a multi-branch attention learning framework consisting of raw, object, and part branches is introduced. The three branches are connected by an attention mechanism, which enables the network to learn fine-grained features at different scales: the raw, object, and part levels. Second, treating noisy labels as anomalies, the APL loss framework, which guarantees both robustness and sufficient learning, is adopted to achieve robust predictions in each branch. Third, in determining the final predictions, the outputs from the global (raw) and object branches are combined to achieve higher classification performance. Experiments on fine-grained image datasets show that the proposed approach is noise-robust and achieves excellent classification performance in the presence of noisy labels.
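The APL framework pairs an “active” loss, which pushes up the probability of the given label, with a “passive” loss, which pushes down the probabilities of the other classes, so that the combination stays robust to noisy labels while still learning sufficiently. The abstract does not say which active/passive pair the authors instantiate, so the following PyTorch sketch uses one common combination, normalized cross entropy (active) plus reverse cross entropy (passive); the class name `NCEandRCE`, the weights `alpha`/`beta`, and the averaging fusion of branch outputs are illustrative assumptions, not the paper’s exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NCEandRCE(nn.Module):
    """Illustrative APL loss: normalized cross entropy (active term)
    plus reverse cross entropy (passive term). The choice of terms and
    the hyperparameters are assumptions, not taken from the paper."""

    def __init__(self, num_classes: int, alpha: float = 1.0, beta: float = 1.0):
        super().__init__()
        self.num_classes = num_classes
        self.alpha = alpha  # weight of the active (NCE) term
        self.beta = beta    # weight of the passive (RCE) term

    def forward(self, logits: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        log_probs = F.log_softmax(logits, dim=1)
        # Active term: cross entropy normalized by the CE summed over all
        # classes, which bounds the loss and tempers overfitting to noise.
        ce = -log_probs.gather(1, labels.unsqueeze(1)).squeeze(1)
        nce = ce / (-log_probs.sum(dim=1))
        # Passive term: reverse cross entropy; log(0) on the non-target
        # one-hot entries is avoided by clamping to a small constant.
        probs = log_probs.exp().clamp(min=1e-7)
        one_hot = F.one_hot(labels, self.num_classes).float().clamp(min=1e-4)
        rce = -(probs * one_hot.log()).sum(dim=1)
        return self.alpha * nce.mean() + self.beta * rce.mean()

# Each branch (raw, object, part) would be trained with this robust loss.
# At test time the global (raw) and object branch outputs are combined;
# a simple softmax-averaging fusion, again as an assumption:
def fuse_predictions(raw_logits: torch.Tensor, obj_logits: torch.Tensor) -> torch.Tensor:
    return (F.softmax(raw_logits, dim=1) + F.softmax(obj_logits, dim=1)) / 2
```

Because both terms are bounded, the combined loss resists fitting mislabeled samples while the active term keeps gradients informative, which is one concrete way to realize the “robustness and sufficient learning” trade-off the summary describes.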
ISSN: 0178-2789, 1432-2315
DOI: 10.1007/s00371-022-02686-w