Semi-Supervised Nearest Mean Classification Through a Constrained Log-Likelihood

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 26, No. 5, pp. 995-1006
Main Authors: Loog, Marco; Jensen, Are Charles
Format: Journal Article
Language: English
Published: IEEE, United States, 01.05.2015

Summary: We cast a semi-supervised nearest mean classifier, previously introduced by the first author, in a more principled log-likelihood formulation that is subject to constraints. This, in turn, leads us to make the important suggestion to not only investigate error rates of semi-supervised learners but also to consider the risk they originally aim to optimize. We demonstrate empirically that in terms of classification error, mixed results are obtained when comparing supervised to semi-supervised nearest mean classification, while in terms of log-likelihood on the test set, the semi-supervised method consistently outperforms its supervised counterpart. Comparisons to self-learning, a standard approach in semi-supervised learning, are included to further clarify the way in which our constrained nearest mean classifier improves over regular, supervised nearest mean classification.
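The abstract names the ingredients (nearest mean classification, a constraint informed by unlabelled data, and log-likelihood as an evaluation criterion next to error rate) but does not spell out the estimator itself. The sketch below, in Python with NumPy, is one illustrative reading, not the paper's method: a plain supervised nearest mean classifier, a hypothetical moment-constrained semi-supervised variant that shifts the class means so their prior-weighted average matches the mean of all available data, and a held-out log-likelihood under a spherical Gaussian model. The function names, the shifting rule, and the unit-variance assumption are ours.

```python
import numpy as np

def fit_nmc(X, y):
    # Supervised nearest mean classifier: one mean per class plus class priors.
    classes = np.unique(y)
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    priors = np.array([np.mean(y == c) for c in classes])
    return classes, means, priors

def fit_constrained_nmc(X_lab, y_lab, X_unl):
    # Hypothetical semi-supervised variant: translate the labelled class means
    # so that their prior-weighted average coincides with the mean of all data
    # (labelled + unlabelled). This moment-style constraint is an assumption
    # made for illustration; the paper derives its estimator from a constrained
    # log-likelihood and may differ in detail.
    classes, means, priors = fit_nmc(X_lab, y_lab)
    overall_mean = np.vstack([X_lab, X_unl]).mean(axis=0)
    shift = overall_mean - priors @ means
    return classes, means + shift, priors

def predict(model, X):
    # Assign each sample to the class with the nearest (Euclidean) mean.
    classes, means, _ = model
    sq_dist = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
    return classes[sq_dist.argmin(axis=1)]

def avg_test_log_likelihood(model, X, sigma2=1.0):
    # Average per-sample log-likelihood under a spherical Gaussian mixture with
    # the fitted means and priors (unit variance is an assumption); this is the
    # kind of intrinsic criterion the abstract suggests reporting next to error.
    classes, means, priors = model
    d = X.shape[1]
    sq_dist = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
    log_comp = (np.log(priors)[None, :]
                - 0.5 * sq_dist / sigma2
                - 0.5 * d * np.log(2.0 * np.pi * sigma2))
    m = log_comp.max(axis=1, keepdims=True)           # log-sum-exp over classes
    return float(np.mean(m[:, 0] + np.log(np.exp(log_comp - m).sum(axis=1))))
```

A usage pattern that mirrors the abstract's evaluation protocol would fit both fit_nmc (on the labelled data alone) and fit_constrained_nmc (labelled plus unlabelled), then report, on a held-out test set, both the error rate of predict and avg_test_log_likelihood, so that the comparison covers the classification error as well as the likelihood-type risk the semi-supervised estimator targets.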
ISSN: 2162-237X
EISSN: 2162-2388
DOI: 10.1109/TNNLS.2014.2329567