
Offline A/B testing for Recommender Systems

Authors:
Gilotte, Alexandre
Calauzènes, Clément
Nedelec, Thomas
Abraham, Alexandre
Dollé, Simon
Publication Year:
2018

Abstract

Before A/B testing a new version of a recommender system online, it is common to perform offline evaluations on historical data. We focus on evaluation methods that compute an estimator of the potential uplift in revenue that this new technology could generate; such estimators help teams iterate faster and avoid losing money by detecting poor policies early. These estimators are known as counterfactual or off-policy estimators. We show experimentally that traditional counterfactual estimators, such as capped importance sampling and normalised importance sampling, do not achieve a satisfactory bias-variance trade-off in the context of personalised product recommendation for online advertising. We propose two variants of counterfactual estimators, with different models of the bias, that prove accurate in real-world conditions. We benchmark these estimators by showing their correlation with business metrics observed when running online A/B tests on a commercial recommender system.
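For reference, the baseline estimators named in the abstract (capped and normalised importance sampling) have standard textbook forms. The sketch below is a minimal illustration of those standard forms, not the paper's proposed variants; the function name, argument names, and the cap value are hypothetical choices made for the example.

```python
import numpy as np

def ips_estimators(rewards, target_probs, logging_probs, cap=10.0):
    """Standard inverse-propensity (importance sampling) estimators of a
    target policy's expected reward from logged data.

    rewards       : observed rewards r_i for the logged actions
    target_probs  : probability of the logged action under the new policy
    logging_probs : probability of the logged action under the logging policy
    cap           : clipping threshold for the capped estimator (illustrative)
    """
    w = target_probs / logging_probs  # importance weights w_i

    # Plain IPS: unbiased, but high variance when some weights are large.
    ips = np.mean(w * rewards)

    # Capped IPS: clip the weights at `cap`; reduces variance at the cost
    # of introducing a (downward) bias.
    capped_ips = np.mean(np.minimum(w, cap) * rewards)

    # Normalised (self-normalised) IPS: divide by the sum of weights instead
    # of the sample size; biased but consistent, typically lower variance.
    snips = np.sum(w * rewards) / np.sum(w)

    return ips, capped_ips, snips
```

The bias-variance trade-off discussed in the paper is visible directly in these formulas: lowering `cap` or normalising by the weight sum shrinks the variance contributed by rare, heavily up-weighted actions, but moves the estimate away from the unbiased plain-IPS value.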

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1801.07030
Document Type:
Working Paper
Full Text:
https://doi.org/10.1145/3159652.3159687