Sample Balancing for Deep Learning-Based Visual Recognition

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 31, No. 10, pp. 3962-3976
Main Authors: Chen, Xin; Weng, Jian; Luo, Weiqi; Lu, Wei; Wu, Huimin; Xu, Jiaming; Tian, Qi
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.10.2020

Summary: Sample balancing comprises sample selection and sample reweighting. Sample selection aims to remove low-quality samples that may drive training toward poor local optima, while sample reweighting aims to assign each sample an appropriate weight to improve performance. In this article, we integrate a sample selection method based on self-paced learning into deep learning frameworks and study how different selection strategies affect the training of deep networks. In addition, most existing sample reweighting methods rely mainly on the per-class sample count as their metric, which does not fully account for sample quality. To improve performance, we propose a novel metric based on multiview semantic encoders that reweights samples more appropriately. We then propose an optimization mechanism that embeds the sample weights into the loss functions of deep networks, so that the networks can be trained end to end. We conduct experiments on the CIFAR and ImageNet data sets, and the results demonstrate that the proposed sample balancing method improves the performance of deep learning methods on several visual recognition tasks.
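
The summary describes two mechanisms: a self-paced selection rule that drops samples whose current loss is too high, and per-sample weights embedded directly in the training loss so the network still trains end to end. Below is a minimal PyTorch-style sketch of how such a selective, weighted loss could be wired into an ordinary training step; the function name balanced_loss, the hard-threshold selection rule, and the source of the per-sample weights are assumptions for illustration only, not the authors' exact formulation (the paper derives its weights from multiview semantic encoders).

    import torch
    import torch.nn.functional as F

    def balanced_loss(logits, targets, sample_weights, spl_threshold):
        """Weighted cross-entropy with a hard self-paced selection step.

        sample_weights: per-sample weights (here assumed to come from some
            quality metric supplied by the caller).
        spl_threshold: samples whose individual loss exceeds this value are
            dropped from the current update (the classic hard self-paced rule);
            the threshold is usually raised over epochs to admit harder samples.
        """
        per_sample = F.cross_entropy(logits, targets, reduction="none")
        # Hard self-paced selection: keep only the currently "easy" samples.
        selected = (per_sample <= spl_threshold).float()
        weights = selected * sample_weights
        # Normalize so the loss scale stays comparable across batches.
        denom = weights.sum().clamp(min=1.0)
        return (weights * per_sample).sum() / denom

    # Toy usage with random data (shapes chosen arbitrarily for the example):
    logits = torch.randn(8, 10, requires_grad=True)   # batch of 8, 10 classes
    targets = torch.randint(0, 10, (8,))
    weights = torch.ones(8)        # uniform here; metric-derived in practice
    loss = balanced_loss(logits, targets, weights, spl_threshold=2.5)
    loss.backward()

In a full training loop, logits would come from the network and the threshold and weights would be updated as training progresses, which is what lets the sample-balancing terms be optimized jointly with the network parameters.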
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2019.2947789