Nonparametric Sparse Matrix Decomposition for Cross-View Dimensionality Reduction


Bibliographic Details
Published in: IEEE Transactions on Multimedia, Vol. 19, No. 8, pp. 1848-1859
Main Authors: Huawen Liu, Lin Liu, Thuc Duy Le, Ivan Lee, Shiliang Sun, Jiuyong Li
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.08.2017

Summary: Cross-view data are collected from two different views or sources about the same subjects. As the information from these views often consolidates and/or complements each other, cross-view data analysis can yield more insights for decision making. A main challenge of cross-view data analysis is how to effectively explore the inherently correlated, high-dimensional data. Dimensionality reduction offers an effective solution to this problem; however, how to choose the right models and the parameters involved in dimensionality reduction is still an open problem. In this paper, we propose an effective sparse learning algorithm for cross-view dimensionality reduction. A distinguishing characteristic of our model selection is that it is nonparametric and automatic. Specifically, we represent the correlation of cross-view data using a covariance matrix. Then, we decompose the matrix into a sequence of low-rank matrices by solving an optimization problem in an alternating least squares manner. More importantly, a new nonparametric sparsity-inducing function is developed to derive a parsimonious model. Extensive experiments are conducted on real-world data sets to evaluate the effectiveness of the proposed algorithm. The results show that our method is competitive with state-of-the-art sparse learning algorithms.
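The pipeline the abstract describes (cross-view covariance matrix, decomposed into a sequence of sparse low-rank terms by alternating least squares) can be sketched as follows. This is a minimal illustration only, not the paper's algorithm: the paper's nonparametric sparsity-inducing function is not specified here, so plain soft-thresholding with a fixed penalty `lam` stands in for it, and the function name `sparse_cross_decomposition` is an invented placeholder.

```python
import numpy as np

def soft_threshold(a, lam):
    # Elementwise soft-thresholding: a simple sparsity-inducing step,
    # standing in for the paper's nonparametric sparsity function.
    return np.sign(a) * np.maximum(np.abs(a) - lam, 0.0)

def sparse_cross_decomposition(X, Y, n_components=2, lam=0.1, n_iter=100, seed=0):
    """Decompose the cross-view covariance C = X^T Y into a sequence of
    sparse rank-one terms d_k * u_k v_k^T via alternating least squares,
    deflating C after each extracted pair."""
    rng = np.random.default_rng(seed)
    C = X.T @ Y                      # cross-view covariance (p x q)
    U, V, D = [], [], []
    for _ in range(n_components):
        v = rng.standard_normal(C.shape[1])
        v /= np.linalg.norm(v)
        u = np.zeros(C.shape[0])
        for _ in range(n_iter):
            # Alternate: fix v, solve for a sparse unit-norm u; then vice versa.
            u = soft_threshold(C @ v, lam)
            if np.linalg.norm(u) == 0:
                break
            u /= np.linalg.norm(u)
            v_new = soft_threshold(C.T @ u, lam)
            if np.linalg.norm(v_new) == 0:
                break
            v = v_new / np.linalg.norm(v_new)
        d = float(u @ C @ v)
        U.append(u); V.append(v); D.append(d)
        C = C - d * np.outer(u, v)   # deflate before the next component
    return np.array(U).T, np.array(D), np.array(V).T
```

With larger `lam`, more entries of each loading pair are driven exactly to zero, trading reconstruction fidelity for a more parsimonious model; the paper's contribution is choosing that trade-off nonparametrically rather than through a hand-tuned penalty.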
ISSN: 1520-9210, 1941-0077
DOI: 10.1109/TMM.2017.2683258