Locally application of naive Bayes for self-training
Published in | Evolving systems Vol. 8; no. 1; pp. 3 - 18 |
---|---|
Main Authors | |
Format | Journal Article |
Language | English |
Published | Springer Berlin Heidelberg, Berlin/Heidelberg, 01.03.2017 |
Subjects | |
Summary: | Semi-supervised algorithms are well known for combining supervised and unsupervised strategies to optimize learning under the assumption that only a few labeled examples, together with their full feature set, are given. In such cases, weak learners are usually preferred as base classifiers, since the iterative behavior of semi-supervised schemes requires building new temporary models at each iteration. The locally weighted naïve Bayes classifier is one such learner, combining the strengths of the NB and k-NN algorithms. In this work, we implemented a self-labeled weighted variant of this local learner that uses NB as the base classifier of a self-training scheme. We performed an in-depth comparison with other well-known semi-supervised classification methods on standard benchmark datasets and concluded that the presented technique achieves better accuracy in most cases. |
---|---|
ISSN: | 1868-6478 1868-6486 |
DOI: | 10.1007/s12530-016-9159-3 |
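
To make the scheme described in the summary concrete, below is a minimal Python sketch of self-training with a locally weighted naive Bayes base learner. It is an illustration under assumed design choices (NumPy arrays as inputs, GaussianNB as the local model, inverse-distance neighbour weights, a fixed number of confident predictions added per round), not the authors' actual implementation; the function names `lwnb_predict_proba` and `self_train` and all parameter values are hypothetical.

```python
# Illustrative sketch only: self-training with a locally weighted naive Bayes
# base learner. Design choices (GaussianNB, k neighbours, inverse-distance
# weights, fixed number of points labelled per round) are assumptions.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import NearestNeighbors

def lwnb_predict_proba(X_lab, y_lab, X_query, k=50):
    """Class probabilities for X_query from an NB model fitted, per query point,
    on its k nearest labelled neighbours with inverse-distance sample weights."""
    k = min(k, len(X_lab))
    nn = NearestNeighbors(n_neighbors=k).fit(X_lab)
    dist, idx = nn.kneighbors(X_query)
    classes = np.unique(y_lab)
    proba = np.zeros((len(X_query), len(classes)))
    for i in range(len(X_query)):
        w = 1.0 / (1.0 + dist[i])          # assumed weighting kernel
        nb = GaussianNB()
        nb.fit(X_lab[idx[i]], y_lab[idx[i]], sample_weight=w)
        # Map the local model's classes back onto the global class list.
        cols = np.searchsorted(classes, nb.classes_)
        proba[i, cols] = nb.predict_proba(X_query[i:i + 1])[0]
    return classes, proba

def self_train(X_lab, y_lab, X_unlab, rounds=10, per_round=20, k=50):
    """Self-training loop: repeatedly label the most confident unlabelled
    points with the locally weighted NB learner and add them to the pool."""
    X_lab, y_lab, X_unlab = X_lab.copy(), y_lab.copy(), X_unlab.copy()
    for _ in range(rounds):
        if len(X_unlab) == 0:
            break
        classes, proba = lwnb_predict_proba(X_lab, y_lab, X_unlab, k=k)
        pick = np.argsort(proba.max(axis=1))[-per_round:]   # most confident
        X_lab = np.vstack([X_lab, X_unlab[pick]])
        y_lab = np.concatenate([y_lab, classes[proba[pick].argmax(axis=1)]])
        X_unlab = np.delete(X_unlab, pick, axis=0)
    return X_lab, y_lab
```

A hypothetical usage would be `X_aug, y_aug = self_train(X_labeled, y_labeled, X_unlabeled)`, after which final predictions on test data could again be made with `lwnb_predict_proba`; the enlarged labelled pool is what the self-training iterations contribute.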