Parallel Vector Field Regularized Non-Negative Matrix Factorization for Image Representation

Bibliographic Details
Published in 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 2216 - 2220
Main Authors Peng, Yong, Tang, Rixin, Kong, Wanzeng, Qin, Feiwei, Nie, Feiping
Format Conference Proceeding
Language English
Published IEEE 01.04.2018

More Information
Summary: Non-negative Matrix Factorization (NMF) is a popular model in machine learning that learns a parts-based representation by seeking two non-negative matrices whose product best approximates the original matrix. However, NMF does not consider the manifold structure of the data, and much existing work uses the graph Laplacian to ensure the smoothness of the learned representation coefficients on the data manifold. Furthermore, recent theoretical work suggests that, beyond smoothness, we should ensure second-order smoothness of the NMF mapping, which measures the linearity of the mapping along the data manifold. Based on the equivalence between the gradient field of a linear function and a parallel vector field, we propose to find the NMF mapping that minimizes the approximation error while requiring its gradient field to be as parallel as possible. The continuous objective function on the manifold can be discretized and optimized under the general NMF framework. Extensive experimental results suggest that the proposed parallel field regularized NMF provides a better data representation and achieves higher accuracy in image clustering.
ISSN:2379-190X
DOI:10.1109/ICASSP.2018.8462486
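As context for the "general NMF framework" mentioned in the summary, the Python sketch below shows the standard first-order baseline that the paper builds on: graph-regularized NMF (GNMF) solved with multiplicative updates, where a graph Laplacian penalty enforces smoothness of the coefficients on the data manifold. This is only an illustrative sketch of that baseline, not the authors' algorithm; the proposed method would replace the Laplacian term with a discretized parallel-field penalty on the gradient field of the mapping, whose construction is not given in this record. The names gnmf, W, and lam are illustrative assumptions.

    import numpy as np

    def gnmf(X, W, k, lam=1.0, n_iter=200, eps=1e-10, seed=0):
        """Graph-regularized NMF via multiplicative updates (illustrative sketch).

        Minimizes ||X - U V^T||_F^2 + lam * Tr(V^T L V), with L = D - W.

        X : (d, n) non-negative data matrix, one sample per column.
        W : (n, n) symmetric non-negative affinity matrix of the data graph.
        k : number of latent factors.
        Returns U (d, k) basis and V (n, k) coefficients with X ~= U @ V.T.
        """
        rng = np.random.default_rng(seed)
        d, n = X.shape
        U = rng.random((d, k))
        V = rng.random((n, k))
        D = np.diag(W.sum(axis=1))  # degree matrix of the graph

        for _ in range(n_iter):
            # Update basis U: standard Lee-Seung rule (the regularizer does not involve U).
            U *= (X @ V) / (U @ (V.T @ V) + eps)
            # Update coefficients V: the Laplacian term pulls V toward smoothness on the graph.
            V *= (X.T @ U + lam * (W @ V)) / (V @ (U.T @ U) + lam * (D @ V) + eps)
        return U, V

For image clustering as evaluated in the paper, one would typically run k-means on the rows of V; the weight lam trades off reconstruction error against the manifold regularizer, and in the proposed method that regularizer additionally penalizes deviation of the mapping's gradient field from being parallel along the manifold.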