QLTL: a Simple yet Efficient Algorithm for Semi-Supervised Transfer Learning
Published in: 10th International Conference on Pattern Recognition Systems (ICPRS-2019), 6 pp. (30–35)
Format: Conference Proceeding
Language: English
Published: Stevenage, UK: The Institution of Engineering & Technology (IET), 2019
Summary: Most machine learning techniques rely on the assumption that training and target data share a similar underlying distribution. When this assumption is violated, they usually fail to generalise; this is one of the situations tackled by transfer learning: achieving good classification performance on different-but-related datasets. In this paper, we consider the specific case where the task is unique, and where the training set(s) and the target set share a similar-but-different underlying distribution. Our method, QLTL (Quadratic Loss Transfer Learning), is a semi-supervised learning approach: we train a set of classifiers on the available training data in order to inject knowledge, and we use a centred kernel polarisation criterion to correct the probability density shift between training and target data. Our method results in a convex problem, leading to an analytic solution. We show encouraging results on a toy example with covariate shift, and good performance on a text-document classification task relative to recent algorithms.
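The covariate-shift setting the abstract refers to can be illustrated with a small sketch: training and target inputs follow different distributions while the labelling rule is shared, and reweighting the training points by a density ratio corrects the mismatch. This is a generic illustration under assumed toy distributions, not the paper's QLTL method (whose centred kernel polarisation criterion is not reproduced here); all names and the decision rule are hypothetical.

```python
# Toy covariate-shift sketch (NOT the paper's QLTL algorithm).
import numpy as np

rng = np.random.default_rng(0)

def label(x):
    # shared conditional p(y|x): y = 1 iff x > 0.5 (assumed decision rule)
    return (x > 0.5).astype(float)

# source (training) inputs ~ N(0, 1); target inputs ~ N(1, 1): same task,
# similar-but-different underlying distributions
xs = rng.normal(0.0, 1.0, 500)
xt = rng.normal(1.0, 1.0, 500)
ys, yt = label(xs), label(xt)

def gauss(x, mu):
    # unit-variance Gaussian density, known in closed form for this toy
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

# importance weights w(x) = p_target(x) / p_source(x) on the source points
w = gauss(xs, 1.0) / gauss(xs, 0.0)

def fit(x, y, weights):
    # weighted least squares for a linear score a*x + b: a convex problem
    # with an analytic solution via the normal equations
    X = np.stack([x, np.ones_like(x)], axis=1)
    a, b = np.linalg.solve(X.T @ (weights[:, None] * X), X.T @ (weights * y))
    return a, b

acc = {}
for name, weights in [("unweighted", np.ones_like(xs)), ("weighted", w)]:
    a, b = fit(xs, ys, weights)
    acc[name] = float(np.mean(((a * xt + b) > 0.5) == (yt == 1.0)))
    print(name, round(acc[name], 3))
```

The quadratic (squared) loss is chosen here only because it yields the closed-form solution the abstract alludes to; the actual criterion used by QLTL differs.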
ISBN: 1839531088; 9781839531088
DOI: 10.1049/cp.2019.0244