
Survival Multiarmed Bandits with Bootstrapping Methods

Authors:
Veroutis, Peter
Godin, Frédéric
Publication Year:
2024

Abstract

The Multiarmed Bandits (MAB) problem has been extensively studied and has seen many practical applications in a variety of fields. The Survival Multiarmed Bandits (S-MAB) open problem is an extension which constrains an agent to a budget that is directly related to observed rewards. As budget depletion leads to ruin, an agent's objective is to both maximize expected cumulative rewards and minimize the probability of ruin. This paper presents a framework that addresses such a dual goal using an objective function balanced by a ruin aversion component. Action values are estimated through a novel approach which consists of bootstrapping samples from previously observed rewards. In numerical experiments, the policies we present outperform benchmarks from the literature.
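The abstract describes scoring actions by bootstrapping from previously observed rewards while penalizing the chance of ruin. As a rough illustration only, the sketch below shows one plausible way such a score could be computed: resample an arm's reward history to simulate future budget trajectories, then combine the bootstrap mean reward with a ruin-frequency penalty. The function name, the additive penalty form, and all parameters (`horizon`, `lam`, `n_boot`) are assumptions for illustration, not the paper's actual objective.

```python
import random

def bootstrap_action_value(rewards, budget, horizon, lam, n_boot=500, rng=None):
    """Illustrative arm score: bootstrap mean reward minus a ruin-aversion penalty.

    rewards : past rewards observed from this arm (non-empty list)
    budget  : current remaining budget (ruin if it reaches <= 0)
    horizon : number of future pulls simulated per bootstrap path
    lam     : ruin-aversion weight (hypothetical trade-off parameter)
    """
    rng = rng or random.Random(0)
    total_reward, ruins = 0.0, 0
    for _ in range(n_boot):
        b, path_total = budget, 0.0
        for _ in range(horizon):
            r = rng.choice(rewards)   # resample past rewards with replacement
            path_total += r
            b += r
            if b <= 0:                # budget depleted: this path ends in ruin
                ruins += 1
                break
        total_reward += path_total
    # Balance expected cumulative reward against estimated ruin probability.
    return total_reward / n_boot - lam * (ruins / n_boot)
```

An agent following this sketch would pull the arm with the highest score; larger `lam` makes it more ruin-averse at the cost of expected reward.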

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2410.16486
Document Type:
Working Paper