A mixture of multiple linear classifiers with sample weight and manifold regularization

Bibliographic Details
Published in: 2017 International Joint Conference on Neural Networks (IJCNN), pp. 3747-3752
Main Authors: Weite Li, Benhui Chen, Bo Zhou, Jinglu Hu
Format: Conference Proceeding
Language: English
Published: IEEE, 01.05.2017

Summary: A mixture of multiple linear classifiers is known for its efficiency and effectiveness in tackling nonlinear classification problems. Each classifier consists of a linear function multiplied by a gating function, which restricts the classifier to a local region. Previous research has mainly focused on the partitioning of local regions, since its quality directly determines the performance of mixture models. However, real-world data sets frequently suffer from two problems, imbalanced and insufficiently labeled data, which also strongly influence the performance of learned classifiers but are seldom considered or explored in the context of mixture models. In this paper, these missing components are introduced into the original formulation of mixture models: a sample weighting scheme for imbalanced data distributions and a manifold regularization term to leverage unlabeled data. Two closed-form solutions are then provided for parameter optimization. Experimental results at the end of the paper demonstrate the significance of the added components. As a result, a mixture of multiple linear classifiers can be extended to imbalanced and semi-supervised learning problems.
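
The abstract describes the model only in prose; in common mixture-of-linear-classifiers notation, the decision function and the augmented training objective look roughly as follows (a sketch with assumed symbols such as g_k, c_i, and W_ij, not necessarily the authors' exact formulation). Each of the K local classifiers is a linear function scaled by a gating function g_k that confines it to a local region:

    f(x) = \sum_{k=1}^{K} g_k(x) \left( w_k^{\top} x + b_k \right)

Adding the two components named in the abstract, a per-sample weight c_i (chosen larger for minority-class samples) and a graph-based manifold regularizer over both labeled and unlabeled points with edge weights W_ij, yields an objective of the form

    \min_{\{w_k, b_k\}} \; \sum_{i \in \mathcal{L}} c_i \, \ell\big(y_i, f(x_i)\big) \;+\; \lambda \sum_{k=1}^{K} \|w_k\|^2 \;+\; \gamma \sum_{i,j} W_{ij} \big( f(x_i) - f(x_j) \big)^2

where \mathcal{L} is the labeled set. If \ell is a squared loss and the gates are held fixed, every term is quadratic in the parameters, which is consistent with the closed-form solutions mentioned in the abstract.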
ISSN:2161-4407
DOI:10.1109/IJCNN.2017.7966328