
Explaining Bayesian Optimization by Shapley Values Facilitates Human-AI Collaboration

Authors :
Rodemann, Julian
Croppi, Federico
Arens, Philipp
Sale, Yusuf
Herbinger, Julia
Bischl, Bernd
Hüllermeier, Eyke
Augustin, Thomas
Walsh, Conor J.
Casalicchio, Giuseppe
Publication Year :
2024

Abstract

Bayesian optimization (BO) with Gaussian processes (GP) has become an indispensable algorithm for black box optimization problems. Not without a dash of irony, BO is often considered a black box itself, lacking ways to provide reasons as to why certain parameters are proposed to be evaluated. This is particularly relevant in human-in-the-loop applications of BO, such as in robotics. We address this issue by proposing ShapleyBO, a framework for interpreting BO's proposals by game-theoretic Shapley values. They quantify each parameter's contribution to BO's acquisition function. Exploiting the linearity of Shapley values, we are further able to identify how strongly each parameter drives BO's exploration and exploitation for additive acquisition functions like the confidence bound. We also show that ShapleyBO can disentangle the contributions to exploration into those that explore aleatoric and epistemic uncertainty. Moreover, our method gives rise to a ShapleyBO-assisted human machine interface (HMI), allowing users to interfere with BO in case proposals do not align with human reasoning. We demonstrate this HMI's benefits for the use case of personalizing wearable robotic devices (assistive back exosuits) by human-in-the-loop BO. Results suggest human-BO teams with access to ShapleyBO can achieve lower regret than teams without.

Comment :
Preprint. Copyright by the authors. 19 pages, 24 figures
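To illustrate the idea described in the abstract, the sketch below computes exact Shapley values for a toy upper-confidence-bound acquisition a(x) = mu(x) + beta * sigma(x) over a proposal's parameters. The coalition game, baseline point, and surrogate functions here are illustrative assumptions, not the authors' implementation; the sketch only demonstrates how linearity lets the per-parameter attribution split into an exploitation part (mu) and an exploration part (beta * sigma).

```python
# Hedged sketch: Shapley decomposition of an additive UCB-style acquisition.
# The "game" v(S) evaluates the acquisition with parameters in coalition S
# set to the proposed values and the rest held at a baseline point.
# Surrogate mu/sigma and the baseline are made-up assumptions for illustration.
from itertools import combinations
from math import factorial

def shapley_values(value_fn, n):
    """Exact Shapley values of set function value_fn over n players."""
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(len(others) + 1):
            for S in combinations(others, k):
                # Shapley weight |S|! (n - |S| - 1)! / n!
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi[i] += w * (value_fn(set(S) | {i}) - value_fn(set(S)))
    return phi

# Toy GP posterior over 3 parameters (stand-ins, not a fitted model).
def mu(x):    return x[0] + 2.0 * x[1]   # exploitation term
def sigma(x): return 0.5 * x[2]          # exploration term
beta = 2.0

proposal = [1.0, 1.0, 1.0]   # point suggested by BO
baseline = [0.0, 0.0, 0.0]   # reference point for absent players

def coalition_point(S):
    return [proposal[j] if j in S else baseline[j] for j in range(3)]

# By linearity of Shapley values, decomposing mu and beta*sigma
# separately sums to the decomposition of the full acquisition.
phi_exploit = shapley_values(lambda S: mu(coalition_point(S)), 3)
phi_explore = shapley_values(lambda S: beta * sigma(coalition_point(S)), 3)
phi_total   = shapley_values(
    lambda S: mu(coalition_point(S)) + beta * sigma(coalition_point(S)), 3)

print(phi_exploit)  # contribution of each parameter to exploitation
print(phi_explore)  # contribution of each parameter to exploration
print(phi_total)    # equals the element-wise sum of the two above
```

The exact enumeration is exponential in the number of parameters, which is fine for the low-dimensional settings typical of human-in-the-loop BO; larger problems would need sampling-based Shapley approximations.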

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2403.04629
Document Type :
Working Paper