Generative multi-view and multi-feature learning for classification
Published in | Information Fusion Vol. 45; pp. 215-226 |
Main Authors | , , , |
Format | Journal Article |
Language | English |
Published | Elsevier B.V, 01.01.2019 |
Summary: | •A generative Bayesian model is proposed for multi-view and multi-feature learning. •The correlation across various views and features is jointly learned. •The label information is embedded to obtain a more discriminative representation. •The method is simplified to a class-conditional model for optimization. •Experimental results on five databases show the superiority of the method.
Multi-view classification has attracted much attention in recent years. In general, an object can be represented by various views or modalities, and exploiting the correlation across different views contributes to improving classification performance. However, each view can also be described with multiple features, and data of this kind is called multi-view and multi-feature data. Different from many existing multi-view methods, which only model multiple views but ignore the intrinsic information among the various features within each view, a generative Bayesian model is proposed in this paper to not only jointly take the features and views into account, but also learn a discriminative representation across distinct categories. A latent variable is assumed for each feature in each view, and the raw feature is treated as a projection of this latent variable from a more discriminative space. In particular, the latent variables in each view belonging to the same class are encouraged to follow the same Gaussian distribution, while those belonging to different classes are driven to follow different distributions, fully exploiting the label information. For optimization, the proposed method is transformed into a class-conditional model, and an effective algorithm is designed to alternately estimate the parameters and latent variables. Experimental results on synthetic data and four real-world datasets illustrate the effectiveness and superiority of our method compared with the state-of-the-art. |
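The generative assumption summarized above can be pictured with a small sketch. The snippet below is a minimal, hypothetical NumPy illustration, not the authors' implementation: it draws class-conditional Gaussian latent variables and projects them through linear maps into the raw spaces of two hypothetical features of a single view. All names and values (latent_dim, the projection matrices, class means, noise_std) are illustrative assumptions; in the paper these quantities would be estimated by the proposed alternating algorithm rather than fixed by hand.

```python
# Minimal sketch (not the authors' implementation) of the generative assumption in
# the abstract: each raw feature is a linear projection of a latent variable drawn
# from a class-conditional Gaussian. All parameters below are illustrative.
import numpy as np

rng = np.random.default_rng(0)

n_classes = 3           # number of categories
latent_dim = 4          # dimensionality of the assumed discriminative latent space
feature_dims = [10, 6]  # raw dimensions of two hypothetical features in one view
n_per_class = 50

# Class-conditional Gaussian parameters: samples of the same class share a mean,
# different classes get different means (the label-driven separation in the abstract).
class_means = rng.normal(scale=3.0, size=(n_classes, latent_dim))
class_covs = [np.eye(latent_dim) for _ in range(n_classes)]

# One projection matrix per (view, feature) pair; here a single view with two features.
projections = [rng.normal(size=(d, latent_dim)) for d in feature_dims]
noise_std = 0.1

def sample_multi_feature_data():
    """Draw labels, class-conditional latents, and projected raw features."""
    labels, latents, features = [], [], [[] for _ in feature_dims]
    for c in range(n_classes):
        z = rng.multivariate_normal(class_means[c], class_covs[c], size=n_per_class)
        latents.append(z)
        labels.extend([c] * n_per_class)
        for f, W in enumerate(projections):
            # Raw feature = projection of the latent variable plus observation noise.
            x = z @ W.T + noise_std * rng.normal(size=(n_per_class, W.shape[0]))
            features[f].append(x)
    return (np.array(labels),
            np.vstack(latents),
            [np.vstack(fx) for fx in features])

labels, Z, X = sample_multi_feature_data()
print(labels.shape, Z.shape, [x.shape for x in X])
```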
ISSN: | 1566-2535, 1872-6305 |
DOI: | 10.1016/j.inffus.2018.02.005 |