
On Large-Scale Dynamic Topic Modeling with Nonnegative CP Tensor Decomposition

Authors:
Ahn, Miju
Eikmeier, Nicole
Haddock, Jamie
Kassab, Lara
Kryshchenko, Alona
Leonard, Kathryn
Needell, Deanna
Madushani, R. W. M. A.
Sizikova, Elena
Wang, Chuntian
Publication Year:
2020

Abstract

There is currently an unprecedented demand for large-scale temporal data analysis due to the explosive growth of data. Dynamic topic modeling has been widely used in social and data sciences with the goal of learning latent topics that emerge, evolve, and fade over time. Previous work on dynamic topic modeling primarily employs the method of nonnegative matrix factorization (NMF), where slices of the data tensor are each factorized into the product of lower-dimensional nonnegative matrices. With this approach, however, information contained in the temporal dimension of the data is often neglected or underutilized. To overcome this issue, we propose instead adopting the method of nonnegative CANDECOMP/PARAFAC (CP) tensor decomposition (NNCPD), where the data tensor is directly decomposed into a minimal sum of outer products of nonnegative vectors, thereby preserving the temporal information. The viability of NNCPD is demonstrated through application to both synthetic and real data, where significantly improved results are obtained compared to those of typical NMF-based methods. The advantages of NNCPD over such approaches are studied and discussed. To the best of our knowledge, this is the first time that NNCPD has been utilized for the purpose of dynamic topic modeling, and our findings will be transformative for both applications and further developments.

Comment: 23 pages, 29 figures, submitted to Women in Data Science and Mathematics (WiSDM) Workshop Proceedings, "Advances in Data Science", AWM-Springer series
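
To make the contrast in the abstract concrete, below is a minimal sketch (not the authors' implementation) of the two approaches: slice-wise NMF, which factorizes each temporal slice of the data tensor independently, versus nonnegative CP decomposition, which factorizes the whole tensor at once so that each latent topic receives an explicit temporal profile. The tensor shape, rank, and use of the numpy, scikit-learn, and tensorly libraries are illustrative assumptions.

    # Illustrative sketch, not the paper's code: slice-wise NMF vs. nonnegative CP
    # decomposition (NNCPD) on a small synthetic documents-by-words-by-time tensor.
    import numpy as np
    import tensorly as tl
    from sklearn.decomposition import NMF
    from tensorly.decomposition import non_negative_parafac

    rng = np.random.default_rng(0)
    n_docs, n_words, n_times, rank = 50, 80, 10, 3  # assumed, illustrative sizes

    # Synthetic nonnegative data tensor (documents x words x time).
    X = rng.random((n_docs, n_words, n_times))

    # NMF baseline: each temporal slice is factorized independently, so the
    # coupling between time steps is not modeled explicitly.
    nmf = NMF(n_components=rank, init="nndsvda", max_iter=500)
    slice_factors = [(nmf.fit_transform(X[:, :, t]), nmf.components_)
                     for t in range(n_times)]

    # NNCPD: decompose the full tensor into a sum of rank-one terms,
    # X ~ sum_r a_r (outer) b_r (outer) c_r, with all factor entries nonnegative.
    weights, factors = non_negative_parafac(tl.tensor(X), rank=rank, n_iter_max=200)
    doc_topics, word_topics, time_topics = factors
    print(doc_topics.shape, word_topics.shape, time_topics.shape)  # (50, 3) (80, 3) (10, 3)

Here the third factor matrix, time_topics, plays the role highlighted in the abstract: each of its columns records how strongly one latent topic is expressed at each time step, which is the temporal information that independent slice-wise NMF discards.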

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2001.00631
Document Type:
Working Paper