A brief review on multi-task learning
| Published in | Multimedia Tools and Applications, Vol. 77, No. 22, pp. 29705-29725 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | New York: Springer US, 01.11.2018 (Springer Nature B.V.) |
| Subjects | |
Summary: Multi-task learning (MTL), which optimizes multiple related learning tasks at the same time, has been widely used in various applications, including natural language processing, speech recognition, computer vision, multimedia data processing, biomedical imaging, socio-biological data analysis, and multi-modality data analysis. MTL is sometimes also referred to as joint learning, and it is closely related to other machine learning subfields such as multi-class learning, transfer learning, and learning with auxiliary tasks. In this paper, we provide a brief review of this topic: we discuss the motivation behind this machine learning method, compare various MTL algorithms, review MTL methods for incomplete data, and discuss its application in deep learning. We aim to give readers a simple way to understand MTL without too many complicated equations, and to help readers apply MTL in their own applications.
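The review itself is not reproduced in this record, but for orientation the following is a minimal sketch of hard parameter sharing, one common MTL formulation in deep learning: a shared encoder feeds several task-specific heads, and the task losses are summed and optimized jointly. The layer sizes, the two example tasks, and the `SharedEncoderMTL` name are illustrative assumptions, not taken from the paper.

```python
# Minimal hard-parameter-sharing MTL sketch (illustrative only; not the paper's method).
# A shared encoder feeds two hypothetical task heads: a regression task and a
# 3-class classification task. The joint objective is the sum of both task losses.
import torch
import torch.nn as nn

class SharedEncoderMTL(nn.Module):
    def __init__(self, in_dim=16, hidden=32):
        super().__init__()
        # Parameters shared by all tasks
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        # Task-specific output heads
        self.regression_head = nn.Linear(hidden, 1)
        self.classification_head = nn.Linear(hidden, 3)

    def forward(self, x):
        h = self.encoder(x)
        return self.regression_head(h), self.classification_head(h)

# Toy joint training step on random data
model = SharedEncoderMTL()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(8, 16)              # shared input features
y_reg = torch.randn(8, 1)           # targets for task 1 (regression)
y_cls = torch.randint(0, 3, (8,))   # targets for task 2 (classification)

pred_reg, pred_cls = model(x)
loss = nn.MSELoss()(pred_reg, y_reg) + nn.CrossEntropyLoss()(pred_cls, y_cls)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Because the encoder parameters receive gradients from every task loss, the shared representation is regularized by all tasks at once, which is the basic intuition behind MTL that the abstract describes.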
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-018-6463-x