
Do Offline Metrics Predict Online Performance in Recommender Systems?

Authors:
Krauth, Karl
Dean, Sarah
Zhao, Alex
Guo, Wenshuo
Curmei, Mihaela
Recht, Benjamin
Jordan, Michael I.
Publication Year:
2020

Abstract

Recommender systems operate in an inherently dynamical setting. Past recommendations influence future behavior, including which data points are observed and how user preferences change. However, experimenting in production systems with real user dynamics is often infeasible, and existing simulation-based approaches have limited scale. As a result, many state-of-the-art algorithms are designed to solve supervised learning problems, and progress is judged only by offline metrics. In this work we investigate the extent to which offline metrics predict online performance by evaluating eleven recommenders across six controlled simulated environments. We observe that offline metrics are correlated with online performance over a range of environments. However, improvements in offline metrics lead to diminishing returns in online performance. Furthermore, we observe that the ranking of recommenders varies depending on the amount of initial offline data available. We study the impact of adding exploration strategies, and observe that their effectiveness, when compared to greedy recommendation, is highly dependent on the recommendation algorithm. We provide the environments and recommenders described in this paper as RecLab, an extensible, ready-to-use simulation framework: https://github.com/berkeley-reclab/RecLab.

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2011.07931
Document Type:
Working Paper