Transfer Neural Trees: Semi-Supervised Heterogeneous Domain Adaptation and Beyond

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 28, No. 9, pp. 4620-4633
Main Authors: Wei-Yu Chen, Tzu-Ming Harry Hsu, Yao-Hung Hubert Tsai, Ming-Syan Chen, Yu-Chiang Frank Wang
Format: Journal Article
Language: English
Published: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), United States, 01.09.2019

Summary: Heterogeneous domain adaptation (HDA) addresses the task of associating data not only across dissimilar domains but also described by different types of features. Inspired by recent advances in neural networks and deep learning, we propose a deep learning model of transfer neural trees (TNT), which jointly solves cross-domain feature mapping, adaptation, and classification in a unified architecture. As the prediction layer in TNT, we introduce the transfer neural decision forest (transfer-NDF), which learns the neurons in TNT for adaptation via stochastic pruning. To handle semi-supervised HDA, a unique embedding loss term is introduced to TNT for preserving prediction and structural consistency between labeled and unlabeled target-domain data. Furthermore, we show that our TNT can be extended to zero-shot learning for associating image and attribute data with promising performance. Finally, experiments on different classification tasks across features, datasets, and modalities verify the effectiveness of our TNT.
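
The summary describes the TNT architecture only at a high level. As an illustration of the general idea, the following is a minimal Python/PyTorch sketch, not the authors' implementation: it assumes hypothetical layer sizes, uses a single soft decision tree in place of the full transfer-NDF, and omits the stochastic pruning and embedding loss terms described above.

import torch
import torch.nn as nn

class SoftDecisionTree(nn.Module):
    """A single soft decision tree in the spirit of a neural decision forest:
    each internal node routes a sample left/right with a sigmoid gate, and
    each leaf holds a learnable class distribution."""

    def __init__(self, in_dim: int, n_classes: int, depth: int = 4):
        super().__init__()
        self.depth = depth
        self.n_leaves = 2 ** depth
        # One linear gate per internal node (there are n_leaves - 1 of them).
        self.gates = nn.Linear(in_dim, self.n_leaves - 1)
        # Unnormalized class scores stored at each leaf.
        self.leaf_logits = nn.Parameter(torch.zeros(self.n_leaves, n_classes))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        d = torch.sigmoid(self.gates(x))        # (B, n_internal) routing probs
        mu = x.new_ones(x.size(0), 1)           # probability of reaching the root
        start = 0
        for level in range(self.depth):
            width = 2 ** level
            d_level = d[:, start:start + width]  # gates of this level's nodes
            # Split each node's probability mass between its two children.
            mu = torch.stack((mu * d_level, mu * (1.0 - d_level)), dim=2)
            mu = mu.reshape(x.size(0), -1)
            start += width
        leaf_probs = torch.softmax(self.leaf_logits, dim=1)  # (n_leaves, C)
        return mu @ leaf_probs                                # (B, C)

class TNTSketch(nn.Module):
    """Two domain-specific mapping networks feed a shared soft-tree prediction
    layer, mirroring the feature-mapping + prediction-layer split in TNT."""

    def __init__(self, src_dim: int, tgt_dim: int, shared_dim: int, n_classes: int):
        super().__init__()
        # Separate mappings project heterogeneous source/target features
        # into a common space.
        self.map_src = nn.Sequential(nn.Linear(src_dim, shared_dim), nn.ReLU())
        self.map_tgt = nn.Sequential(nn.Linear(tgt_dim, shared_dim), nn.ReLU())
        self.predictor = SoftDecisionTree(shared_dim, n_classes)

    def forward(self, x: torch.Tensor, domain: str) -> torch.Tensor:
        z = self.map_src(x) if domain == "source" else self.map_tgt(x)
        return self.predictor(z)   # class probabilities for each sample

In the semi-supervised setting outlined in the summary, such a model would be trained with a supervised loss on labeled source and target data plus a consistency-style term on unlabeled target data; the exact embedding loss and the forest-level stochastic pruning are defined in the paper itself.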
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2019.2912126