Entropy Minimizing Matrix Factorization
- Publication Year : 2021
Abstract
- Nonnegative Matrix Factorization (NMF) is a widely used data analysis technique that has yielded impressive results in many real-world tasks. Existing NMF methods generally represent each sample as a combination of several centroids and find the optimal centroids by minimizing the sum of the approximation errors. However, outliers deviating from the normal data distribution may produce large residuals and thus dominate the objective value. In this study, an Entropy Minimizing Matrix Factorization framework (EMMF) is developed to tackle this problem. Since outliers are usually far fewer than the normal samples, a new entropy loss function is established for matrix factorization, which minimizes the entropy of the residual distribution and allows a few samples to have large approximation errors. In this way, the outliers do not affect the approximation of the normal samples. Multiplicative updating rules for EMMF are also designed, and their convergence is proved both theoretically and experimentally. In addition, a graph-regularized version of EMMF (G-EMMF) is presented to deal with complex data structure. Clustering results on various synthetic and real-world datasets demonstrate the reasonableness of the proposed models, and their effectiveness is verified through comparison with state-of-the-art methods.
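- For background, the baseline that EMMF modifies is standard NMF, which minimizes the sum of squared approximation errors ||X - WH||_F^2 over nonnegative factors W and H, typically via Lee-Seung multiplicative updates. The sketch below illustrates these standard multiplicative updates in NumPy; it is a minimal illustration of the update style the abstract refers to, not the entropy-loss objective or the EMMF/G-EMMF update rules derived in the paper.

```python
# Minimal sketch of standard NMF with Lee-Seung multiplicative updates,
# i.e. the squared-error baseline that EMMF replaces with an entropy loss.
# This is NOT the EMMF update rule from the paper; it only shows the kind
# of multiplicative iteration the abstract mentions.
import numpy as np

def nmf_multiplicative(X, rank, n_iter=200, eps=1e-10, seed=0):
    """Factor a nonnegative matrix X (features x samples) as W @ H."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, rank))
    H = rng.random((rank, m))
    for _ in range(n_iter):
        # H <- H * (W^T X) / (W^T W H)
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        # W <- W * (X H^T) / (W H H^T)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

if __name__ == "__main__":
    X = np.abs(np.random.default_rng(1).normal(size=(50, 100)))
    W, H = nmf_multiplicative(X, rank=5)
    print("relative residual:", np.linalg.norm(X - W @ H) / np.linalg.norm(X))
```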
Details
- Database : arXiv
- Publication Type : Report
- Accession number : edsarx.2103.13487
- Document Type : Working Paper