
Rankitect: Ranking Architecture Search Battling World-class Engineers at Meta Scale

Authors:
Wen, Wei
Liu, Kuang-Hung
Fedorov, Igor
Zhang, Xin
Yin, Hang
Chu, Weiwei
Hassani, Kaveh
Sun, Mengying
Liu, Jiang
Wang, Xu
Jiang, Lin
Chen, Yuxin
Zhang, Buyun
Liu, Xi
Cheng, Dehua
Chen, Zhengxing
Zhao, Guang
Han, Fangqiu
Yang, Jiyan
Hao, Yuchen
Xiong, Liang
Chen, Wen-Yen
Publication Year:
2023

Abstract

Neural Architecture Search (NAS) has demonstrated its efficacy in computer vision and shown potential for ranking systems. However, prior work has focused on academic problems that are evaluated at small scale under well-controlled, fixed baselines. In industry systems, such as the ranking systems at Meta, it is unclear whether NAS algorithms from the literature can outperform production baselines, because of: (1) scale - Meta ranking systems serve billions of users; (2) strong baselines - the baselines are production models that hundreds to thousands of world-class engineers have optimized for years since the rise of deep learning; (3) dynamic baselines - engineers may establish new and stronger baselines while the NAS search is running; and (4) efficiency - the search pipeline must yield results quickly, in alignment with the productionization life cycle. In this paper, we present Rankitect, a NAS software framework for ranking systems at Meta. Rankitect seeks to build brand-new architectures from scratch by composing low-level building blocks. Rankitect implements and improves state-of-the-art (SOTA) NAS methods, including sampling-based NAS, one-shot NAS, and Differentiable NAS (DNAS), for comprehensive and fair comparison under the same search space. We evaluate Rankitect by comparing it to multiple production ranking models at Meta. We find that Rankitect can discover new models from scratch that achieve a competitive tradeoff between Normalized Entropy (NE) loss and FLOPs. When utilizing a search space designed by engineers, Rankitect can generate models better than the engineers' own, achieving positive offline evaluation and online A/B test results at Meta scale.

Comment: Wei Wen and Kuang-Hung Liu contributed equally
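Normalized Entropy, the offline metric in the tradeoff above, is commonly defined in the click-prediction literature as the model's average cross-entropy loss divided by the entropy of the background positive rate, so values below 1 indicate an improvement over always predicting the average label. A minimal sketch of that computation (the function name and toy inputs are illustrative, not taken from the paper):

```python
import numpy as np

def normalized_entropy(labels: np.ndarray, preds: np.ndarray, eps: float = 1e-12) -> float:
    """Cross-entropy of predictions normalized by the entropy of the
    empirical positive rate; NE < 1 means the model beats a baseline
    that always predicts the average label."""
    preds = np.clip(preds, eps, 1.0 - eps)
    ce = -np.mean(labels * np.log(preds) + (1 - labels) * np.log(1 - preds))
    p = np.clip(labels.mean(), eps, 1.0 - eps)  # background positive rate
    baseline = -(p * np.log(p) + (1 - p) * np.log(1 - p))
    return float(ce / baseline)

# Example: NE for a toy batch of binary labels and predicted probabilities.
labels = np.array([1.0, 0.0, 0.0, 1.0])
preds = np.array([0.8, 0.2, 0.4, 0.6])
print(normalized_entropy(labels, preds))
```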
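The abstract names three search strategies compared under one search space; the paper's implementations are not reproduced here, but the core idea of DNAS is to relax the discrete choice among candidate building blocks into a learnable weighted sum so that architecture parameters can be optimized by gradient descent. A generic DARTS-style sketch of that relaxation in PyTorch (the candidate modules and dimensions are placeholders, not Rankitect's actual blocks):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedBlock(nn.Module):
    """Differentiable choice among candidate building blocks: the output
    is a softmax-weighted sum, so the architecture weights `alpha` receive
    gradients and the strongest candidate can be selected after search."""
    def __init__(self, candidates: list[nn.Module]):
        super().__init__()
        self.candidates = nn.ModuleList(candidates)
        self.alpha = nn.Parameter(torch.zeros(len(candidates)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.candidates))

# Toy search over three candidate blocks for a 16-dim feature vector.
block = MixedBlock([
    nn.Linear(16, 16),                            # plain linear
    nn.Sequential(nn.Linear(16, 16), nn.ReLU()),  # linear + ReLU
    nn.Identity(),                                # skip connection
])
out = block(torch.randn(4, 16))
print(out.shape)             # torch.Size([4, 16])
print(block.alpha.argmax())  # index of the currently preferred block
```

After search converges, the candidate with the largest architecture weight would typically be kept and the weighted mixture discarded, yielding a discrete architecture.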

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2311.08430
Document Type:
Working Paper