
Multi-armed bandits for performance marketing

Authors :
Gigli, M
Stella, F
Publication Year :
2024

Abstract

This paper addresses the problem of optimising the bids and budgets of a set of digital advertising campaigns. We improve on the current state of the art by introducing support for multi-ad-group marketing campaigns and by developing a highly data-efficient parametric contextual bandit. The bandit, which exploits domain knowledge to reduce the exploration space, is shown to be effective in the following settings: few clicks and/or a small conversion rate, short horizons, rapidly changing markets, and low budgets. Furthermore, a bootstrapped Thompson sampling algorithm is adapted to fit the parametric bandit. Extensive numerical experiments on synthetic and real-world data show that, on average, the parametric bandit gains more conversions than state-of-the-art bandits. Performance gains are largest precisely when an optimisation algorithm is needed most, i.e. with a tight budget or many ad groups, though gains are also present in the single-ad-group case.
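To illustrate the bootstrapped Thompson sampling idea mentioned in the abstract, the sketch below implements a generic online-bootstrap bandit over a set of ad groups. This is an assumed, minimal illustration of the general technique (the "double-or-nothing" online bootstrap), not the parametric, domain-knowledge-driven bandit developed in the paper; all class and parameter names are hypothetical.

```python
import random


class BootstrapThompsonBandit:
    """Illustrative bootstrapped Thompson sampling over K ad groups.

    Each arm keeps several bootstrap replicates of its conversion
    estimate; at decision time one replicate per arm is drawn at
    random and the arm with the highest sampled mean is played.
    """

    def __init__(self, n_arms, n_replicates=20, seed=0):
        self.rng = random.Random(seed)
        # Each replicate starts from an optimistic pseudo-observation
        # so that unexplored arms are still tried early on.
        self.replicates = [
            [[1.0] for _ in range(n_replicates)] for _ in range(n_arms)
        ]

    def select_arm(self):
        # Thompson step: sample one bootstrap replicate per arm and
        # act greedily on the sampled estimates.
        sampled = []
        for arm_reps in self.replicates:
            rep = self.rng.choice(arm_reps)
            sampled.append(sum(rep) / len(rep))
        return max(range(len(sampled)), key=sampled.__getitem__)

    def update(self, arm, reward):
        # Online "double-or-nothing" bootstrap: each replicate
        # receives the observation with probability 1/2.
        for rep in self.replicates[arm]:
            if self.rng.random() < 0.5:
                rep.append(reward)
```

In a simulated run with two ad groups whose conversion probabilities differ, the bandit concentrates its pulls on the better arm as the bootstrap replicates separate, which is the behaviour the paper's data-efficiency claims are measured against.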

Details

Database :
OAIster
Notes :
Print, English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1427430710
Document Type :
Electronic Resource