Learning from Imbalanced Data Sets with Weighted Cross-Entropy Function

Bibliographic Details
Published in: Neural Processing Letters, Vol. 50, No. 2, pp. 1937-1949
Main Authors: Aurelio, Yuri Sousa; de Almeida, Gustavo Matheus; de Castro, Cristiano Leite; Braga, Antonio Padua
Format: Journal Article
Language: English
Published: New York: Springer US, 01.10.2019 (Springer Nature B.V.)
Summary: This paper presents a novel approach to deal with the imbalanced data set problem in neural networks by incorporating prior probabilities into a cost-sensitive cross-entropy error function. Several classical benchmarks were tested for performance evaluation using different metrics, namely G-Mean, area under the ROC curve (AUC), adjusted G-Mean, Accuracy, True Positive Rate, True Negative Rate and F1-score. The obtained results were compared to well-known algorithms and showed the effectiveness and robustness of the proposed approach, which results in well-balanced classifiers given different imbalance scenarios.
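The summary describes weighting the cross-entropy loss by class prior probabilities so that errors on the minority class are penalized more heavily. The record does not give the paper's exact formula, so the sketch below shows one common variant of this idea, in which each class term of the binary cross-entropy is scaled by the inverse of its prior; the function name and weighting scheme are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def weighted_cross_entropy(y_true, y_prob, priors, eps=1e-12):
    """Binary cross-entropy with each class term scaled by the inverse
    of its prior probability (hypothetical weighting; the paper's exact
    cost-sensitive scheme may differ).

    y_true : array of 0/1 labels
    y_prob : predicted probabilities for the positive class
    priors : (P(class 0), P(class 1)) estimated from the training set
    """
    y_true = np.asarray(y_true, dtype=float)
    y_prob = np.clip(np.asarray(y_prob, dtype=float), eps, 1 - eps)
    w_pos = 1.0 / priors[1]  # larger weight for the (rare) positive class
    w_neg = 1.0 / priors[0]  # smaller weight for the (common) negative class
    loss = -(w_pos * y_true * np.log(y_prob)
             + w_neg * (1 - y_true) * np.log(1 - y_prob))
    return loss.mean()

# Imbalanced toy data: 90% negatives, 10% positives, uninformative predictions.
y_true = np.array([0] * 9 + [1])
priors = np.array([0.9, 0.1])
y_prob = np.full(10, 0.5)
print(weighted_cross_entropy(y_true, y_prob, priors))
```

With inverse-prior weights the single positive example contributes as much total loss as all nine negatives, so a gradient-based learner is no longer dominated by the majority class; this is the kind of rebalancing the summary's metrics (G-Mean, AUC, TPR/TNR) are designed to reward.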
ISSN: 1370-4621
EISSN: 1573-773X
DOI: 10.1007/s11063-018-09977-1