An Entropy Weighted Nonnegative Matrix Factorization Algorithm for Feature Representation

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 34, No. 9, pp. 5381-5391
Main Authors: Wei, Jiao; Tong, Can; Wu, Bingxue; He, Qiang; Qi, Shouliang; Yao, Yudong; Teng, Yueyang
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.09.2023
Summary: Nonnegative matrix factorization (NMF) has been widely used to learn low-dimensional representations of data. However, NMF pays the same attention to all attributes of a data point, which inevitably leads to inaccurate representations. For example, in a human-face dataset, if an image contains a hat on a head, the hat should be removed or the importance of its corresponding attributes should be decreased during matrix factorization. This article proposes a new type of NMF called entropy weighted NMF (EWNMF), which uses an optimizable weight for each attribute of each data point to emphasize their importance. This process is achieved by adding an entropy regularizer to the cost function and then using the Lagrange multiplier method to solve the problem. Experimental results with several datasets demonstrate the feasibility and effectiveness of the proposed method. The code developed in this study is available at https://github.com/Poisson-EM/Entropy-weighted-NMF.
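The abstract describes the method precisely enough to sketch numerically: a weighted squared-error NMF objective plus an entropy regularizer on the weights, with the weights obtained in closed form via Lagrange multipliers. The NumPy sketch below is a hypothetical reading of that description, not the authors' released implementation (see the GitHub link for that). The function name ewnmf, the parameter gamma, and the assumption that each data point's attribute weights sum to one are illustrative; the factor updates are the standard multiplicative rules for weighted NMF.

```python
import numpy as np

def ewnmf(X, rank, gamma=1.0, n_iter=200, eps=1e-10, seed=0):
    """Entropy-weighted NMF sketch: X (features x samples) ~ U @ V.

    Assumed objective: sum_ij W_ij * (X - U V)_ij^2 + gamma * sum_ij W_ij * log(W_ij),
    with each sample's attribute weights constrained to sum to one
    (a hypothetical reading of the abstract; see the paper for the exact model).
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, rank))
    V = rng.random((rank, n))
    W = np.full((m, n), 1.0 / m)  # start with uniform attribute weights per sample

    for _ in range(n_iter):
        # Standard multiplicative updates for weighted-Frobenius NMF.
        R = U @ V
        U *= ((W * X) @ V.T) / (((W * R) @ V.T) + eps)
        R = U @ V
        V *= (U.T @ (W * X)) / ((U.T @ (W * R)) + eps)
        # Closed-form weight update implied by the entropy regularizer:
        # attributes with larger reconstruction error receive smaller weight,
        # normalized so each data point's (column's) weights sum to one.
        E = (X - U @ V) ** 2
        W = np.exp(-E / gamma)
        W /= W.sum(axis=0, keepdims=True) + eps
    return U, V, W

if __name__ == "__main__":
    # Tiny usage example on random nonnegative data.
    X = np.abs(np.random.default_rng(1).random((50, 30)))
    U, V, W = ewnmf(X, rank=5, gamma=0.5)
    print("reconstruction error:", np.linalg.norm(X - U @ V))
```

Smaller gamma makes the weights concentrate on well-reconstructed attributes (down-weighting outliers such as the hat in the face example), while a large gamma drives the weights back toward uniform, recovering ordinary NMF behavior.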
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2022.3184286