Equalization Loss v2: A New Gradient Balance Approach for Long-tailed Object Detection


Bibliographic Details
Published in: 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 1685 - 1694
Main Authors: Tan, Jingru; Lu, Xin; Zhang, Gang; Yin, Changqing; Li, Quanquan
Format: Conference Proceeding
Language: English
Published: IEEE, 01.06.2021

Summary: Recently proposed decoupled training methods have emerged as a dominant paradigm for long-tailed object detection. However, they require an extra fine-tuning stage, and the disjointed optimization of representation and classifier may lead to suboptimal results. Meanwhile, end-to-end training methods, such as equalization loss (EQL), still perform worse than decoupled training methods. In this paper, we reveal that the main issue in long-tailed object detection is the imbalance of gradients between positives and negatives, and find that EQL does not solve it well. To address this problem, we introduce a new version of equalization loss, called equalization loss v2 (EQL v2), a novel gradient-guided reweighting mechanism that re-balances the training process for each category independently and equally. Extensive experiments are performed on the challenging LVIS benchmark. EQL v2 outperforms the original EQL by about 4 points of overall AP, with 14 ∼ 18 points of improvement on the rare categories. More importantly, it also surpasses decoupled training methods. Without further tuning, EQL v2 improves over EQL by 7.3 points AP on the Open Images dataset, showing strong generalization ability. Code has been released at https://github.com/tztztztztz/eqlv2
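The core idea the abstract describes — tracking accumulated positive and negative gradients per category and reweighting the loss by their ratio — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class name `GradientBalancedReweighter`, the sigmoid-shaped mapping, and the hyperparameter values (`gamma`, `mu`, `alpha`) are assumptions made here for demonstration; consult the released code at the repository above for the actual method.

```python
import numpy as np

def gradient_mapping(g, gamma=12.0, mu=0.8):
    # Map the accumulated pos/neg gradient ratio g into (0, 1).
    # Categories whose ratio is near or above mu map close to 1;
    # starved (rare) categories with tiny ratios map close to 0.
    return 1.0 / (1.0 + np.exp(-gamma * (g - mu)))

class GradientBalancedReweighter:
    """Illustrative sketch of gradient-guided per-category reweighting:
    accumulate positive/negative gradient magnitudes per class and
    derive loss weights from their ratio (hypothetical helper)."""

    def __init__(self, num_classes, gamma=12.0, mu=0.8, alpha=4.0):
        self.gamma, self.mu, self.alpha = gamma, mu, alpha
        self.pos_grad = np.zeros(num_classes)
        self.neg_grad = np.zeros(num_classes)

    def accumulate(self, pos_batch, neg_batch):
        # pos_batch / neg_batch: per-class gradient magnitudes
        # collected from the current training step.
        self.pos_grad += pos_batch
        self.neg_grad += neg_batch

    def weights(self):
        # Ratio of accumulated positive to negative gradients;
        # eps guards against division by zero early in training.
        g = self.pos_grad / (self.neg_grad + 1e-10)
        f = gradient_mapping(g, self.gamma, self.mu)
        pos_w = 1.0 + self.alpha * (1.0 - f)  # up-weight starved positives
        neg_w = f                             # down-weight excess negatives
        return pos_w, neg_w
```

Under this sketch, a rare category whose positive gradients are dwarfed by its negatives receives a larger positive weight and a smaller negative weight than a frequent, gradient-balanced category, which is the re-balancing behavior the abstract attributes to EQL v2.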
ISSN:2575-7075
DOI:10.1109/CVPR46437.2021.00173