Multi-task learning with one-class SVM

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 133, pp. 416–426
Main Authors: He, Xiyan; Mourot, Gilles; Maquin, Didier; Ragot, José; Beauseroy, Pierre; Smolarz, André; Grall-Maës, Edith
Format: Journal Article
Language: English
Published: Amsterdam: Elsevier B.V., 10.06.2014

Summary: Multi-task learning has been developed as an effective way to improve generalization performance by training multiple related tasks simultaneously. Determining the relatedness between tasks is usually the key to formulating a multi-task learning method. In this paper, we assume that when tasks are related to each other, their models are close; that is, the models or their parameters lie close to a certain mean function. Following this task-relatedness assumption, two multi-task learning formulations based on one-class support vector machines (one-class SVM) are presented. With the help of a new kernel design, both multi-task learning methods can be solved by the optimization program of a single one-class SVM. Experiments conducted on both a low-dimensional nonlinear toy dataset and high-dimensional textured images show that our approaches lead to very encouraging results.
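The abstract's central trick, training one single one-class SVM over all tasks by folding the task coupling into the kernel, can be sketched as follows. This is an illustrative reconstruction, not the paper's exact formulation: it uses the well-known Evgeniou–Pontil-style multi-task kernel (a shared term weighted by 1/μ plus a task-specific term active only when two samples share a task) together with scikit-learn's `OneClassSVM` in precomputed-kernel mode. The task data, the `mu` and `gamma` values, and the helper names are assumptions for the sketch.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Two related tasks: Gaussian clouds with slightly shifted means
# (stand-ins for the paper's toy data; purely illustrative).
X1 = rng.normal(loc=0.0, scale=1.0, size=(40, 2))
X2 = rng.normal(loc=0.5, scale=1.0, size=(40, 2))
X = np.vstack([X1, X2])
task = np.array([0] * 40 + [1] * 40)

def rbf(A, B, gamma=0.5):
    """Plain RBF kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def multi_task_gram(A, ta, B, tb, mu=1.0, gamma=0.5):
    """Multi-task kernel: a shared part (weight 1/mu) plus a
    task-specific part that fires only when both samples come
    from the same task, so related tasks share structure while
    keeping individual deviations."""
    same = (ta[:, None] == tb[None, :]).astype(float)
    return (1.0 / mu + same) * rbf(A, B, gamma)

# A single one-class SVM over the joint Gram matrix covers all tasks.
G = multi_task_gram(X, task, X, task)
clf = OneClassSVM(kernel="precomputed", nu=0.1).fit(G)

# Score new task-0 points against the jointly trained model.
X_new = rng.normal(loc=0.0, scale=1.0, size=(5, 2))
G_new = multi_task_gram(X_new, np.zeros(5, dtype=int), X, task)
pred = clf.predict(G_new)  # +1 for inliers, -1 for outliers
```

Because the coupling lives entirely in the Gram matrix, the standard single-task one-class SVM solver is reused unchanged, which matches the abstract's claim that both formulations reduce to one ordinary one-class SVM optimization.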
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2013.12.022