Extreme Multi-label Learning for Semantic Matching in Product Search
- Author
Wei-Cheng Chang, Daniel Jiang, Hsiang-Fu Yu, Choon-Hui Teo, Jiong Zhang, Kai Zhong, Kedarnath Kolluri, Qie Hu, Nikhil Shandilya, Vyacheslav Ievgrafov, Japinder Singh, and Inderjit S. Dhillon
- Subjects
Computer Science - Information Retrieval; Computer Science - Machine Learning
- Abstract
We consider the problem of semantic matching in product search: given a customer query, retrieve all semantically related products from a huge catalog of 100 million items or more. Because of the large catalog space and real-time latency constraints, semantic matching algorithms must achieve not only high recall but also low latency. Conventional lexical matching approaches (e.g., Okapi BM25) exploit inverted indices for fast inference, but fail to capture behavioral signals between queries and products. In contrast, embedding-based models learn semantic representations from customer behavior data, but their performance is often limited by the shallow neural encoders imposed by latency constraints. Semantic product search can be viewed as an eXtreme Multi-label Classification (XMC) problem, where customer queries are input instances and products are output labels. In this paper, we aim to improve semantic product search with tree-based XMC models whose inference time complexity is logarithmic in the number of products. We consider hierarchical linear models with n-gram features for fast real-time inference. Quantitatively, our method maintains a low latency of 1.25 milliseconds per query and achieves a 65% relative improvement in Recall@100 (60.9% vs. 36.8%) over a competing embedding-based DSSM model. Our model is robust to weight pruning with varying thresholds, which can flexibly meet different system requirements for online deployments. Qualitatively, our method can retrieve products that are complementary to the existing product search system and add diversity to the match set.
- Comment
Accepted in KDD 2021 Applied Data Science Track
- Published
2021
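
The abstract's central claim is that inference over a tree-structured label space costs O(beam_size × depth) linear-model evaluations, i.e., logarithmic in the number of products. Below is a minimal Python sketch of that idea, under stated assumptions; it is not the authors' implementation or the API of any XMC library. The `Node` class, `beam_search` function, and `prune` helper are hypothetical names, and tree construction, n-gram featurization, and model training are all omitted.

```python
# Minimal sketch (assumed names, not the paper's code): beam-search inference
# over a hierarchical linear model. Each tree node holds a linear scorer over
# the query's feature vector; descending with a fixed beam visits only
# O(beam_size * depth) nodes, i.e., logarithmic in the number of products.
import numpy as np

class Node:
    def __init__(self, children=None, label=None, weights=None):
        self.children = children or []  # internal nodes: list of subtrees
        self.label = label              # leaf nodes: a product identifier
        self.weights = weights          # linear scorer for this node, shape (d,)

def beam_search(root, x, beam_size=10):
    """Return (product, score) pairs for query features x, keeping the
    top `beam_size` highest-scoring nodes at every level of the tree."""
    beam = [(0.0, root)]
    while any(node.children for _, node in beam):
        candidates = []
        for score, node in beam:
            if not node.children:       # leaf reached early: carry it forward
                candidates.append((score, node))
                continue
            for child in node.children:
                # accumulate path scores: parent score + child's linear score
                candidates.append((score + float(child.weights @ x), child))
        candidates.sort(key=lambda pair: -pair[0])
        beam = candidates[:beam_size]
    return [(node.label, score) for score, node in beam]

def prune(weights, threshold=1e-3):
    """One plausible reading of the pruning the abstract mentions:
    magnitude-based thresholding keeps per-node scorers sparse and fast."""
    w = weights.copy()
    w[np.abs(w) < threshold] = 0.0
    return w

# Toy usage: a depth-2 tree over 4 products with 3-dimensional features.
rng = np.random.default_rng(0)
d = 3
leaves = [Node(label=f"product_{i}", weights=rng.normal(size=d)) for i in range(4)]
internal = [Node(children=leaves[:2], weights=rng.normal(size=d)),
            Node(children=leaves[2:], weights=rng.normal(size=d))]
root = Node(children=internal)
print(beam_search(root, np.array([1.0, 0.5, -0.2]), beam_size=2))
```

The design point the sketch illustrates is why latency stays low at catalog scale: doubling the number of products adds one tree level, so per-query work grows additively rather than linearly, and sparse n-gram features keep each node's dot product cheap.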