
Linear-Complexity Relaxed Word Mover's Distance with GPU Acceleration

Authors :
Atasu, Kubilay
Parnell, Thomas
Dünner, Celestine
Sifalakis, Manolis
Pozidis, Haralampos
Vasileiadis, Vasileios
Vlachos, Michail
Berrospi, Cesar
Labbi, Abdel
Publication Year :
2017

Abstract

The amount of unstructured text-based data is growing every day. Querying, clustering, and classifying this big data requires similarity computations across large sets of documents. Whereas low-complexity similarity metrics are available, attention has been shifting towards more complex methods that achieve higher accuracy. In particular, the Word Mover's Distance (WMD) method proposed by Kusner et al. is a promising new approach, but its time complexity grows cubically with the number of unique words in the documents. The Relaxed Word Mover's Distance (RWMD) method, also proposed by Kusner et al., reduces the time complexity from cubic to quadratic at the cost of a limited loss in accuracy compared with WMD. Our work contributes a low-complexity implementation of the RWMD that reduces the average time complexity to linear when operating on large sets of documents. Our linear-complexity RWMD implementation, henceforth referred to as LC-RWMD, maps well onto GPUs and can be efficiently distributed across a cluster of GPUs. Our experiments on real-life datasets demonstrate 1) a performance improvement of two orders of magnitude with respect to our GPU-based distributed implementation of the quadratic RWMD, and 2) a performance improvement of three to four orders of magnitude with respect to our distributed WMD implementation that uses GPU-based RWMD for pruning.

Comment: To appear in the 2017 IEEE International Conference on Big Data (Big Data 2017), http://cci.drexel.edu/bigdata/bigdata2017/, December 11-14, 2017, Boston, MA, USA
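
For reference, the RWMD lower bound of Kusner et al. relaxes the flow constraints of WMD so that every word of one document moves all of its normalized bag-of-words weight to its closest word (by embedding distance) in the other document; the tighter of the two one-sided bounds is kept. The following is a minimal NumPy sketch of that quadratic-time bound, not the paper's LC-RWMD implementation; the function and variable names (rwmd, x_emb, x_w, y_emb, y_w) are illustrative only.

    import numpy as np

    def rwmd(x_emb, x_w, y_emb, y_w):
        """Relaxed Word Mover's Distance (lower bound of WMD), after Kusner et al.

        x_emb: (n, d) embeddings of the unique words of document X
        x_w:   (n,)   normalized bag-of-words weights of those words
        y_emb, y_w:   the same quantities for document Y
        """
        # Pairwise Euclidean distances between the two documents' word embeddings.
        dists = np.linalg.norm(x_emb[:, None, :] - y_emb[None, :, :], axis=-1)
        # Relax the outgoing/incoming flow constraints: each word sends all of its
        # weight to the nearest word of the other document.
        lb_xy = np.dot(x_w, dists.min(axis=1))
        lb_yx = np.dot(y_w, dists.min(axis=0))
        # RWMD is the tighter (larger) of the two one-sided lower bounds.
        return max(lb_xy, lb_yx)

Computing this bound for a query against every document in a large collection is where the quadratic cost arises; the LC-RWMD scheme described in the paper reduces the average cost per document pair to linear when whole document sets are processed together on GPUs.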

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.1711.07227
Document Type :
Working Paper