A Sim2Real Approach for Identifying Task-Relevant Properties in Interpretable Machine Learning
Publication Year: 2024
Abstract

Explanations of an AI system's behavior can assist human decision-makers, but the most useful explanation depends on the decision's context, referred to as the downstream task. User studies are necessary to determine the best explanation for each task. Unfortunately, testing every combination of explanation and task is impractical, especially given the many factors beyond an explanation's content that influence human+AI collaboration. This work leverages two insights to streamline the search for the most effective explanation. First, explanations can be characterized by properties, such as faithfulness or complexity, which indicate whether they contain the right information for the task. Second, candidate explanations can be screened with simulated users; we introduce XAIsim2real, a pipeline for running such synthetic user studies. In our validation study, XAIsim2real accurately predicts user preferences across three tasks, making it a valuable tool for refining explanation choices before committing to a full user study. It also uncovers nuanced relationships, such as how a limited cognitive budget curbs a user's engagement with complex explanations, a trend confirmed with real users.
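To make the property-based, cognitive-budget idea concrete, here is a minimal, hypothetical sketch in Python: candidate explanations are scored on two properties (faithfulness and complexity), and a simulated user's utility is penalized when complexity exceeds their budget. The class names, property values, weights, and the utility form are illustrative assumptions for exposition, not the paper's actual XAIsim2real implementation.

```python
# Minimal sketch of a synthetic user study over explanation properties.
# All names and the utility form below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Explanation:
    name: str
    faithfulness: float  # how well it reflects the model, in [0, 1]
    complexity: float    # cognitive load to consume it, in [0, 1]

def simulated_preference(expl: Explanation, budget: float,
                         w_faith: float = 1.0) -> float:
    """Utility a simulated user assigns to an explanation.

    Faithfulness adds value; complexity beyond the user's cognitive
    budget is penalized, so complex explanations lose appeal for
    low-budget users.
    """
    overload = max(0.0, expl.complexity - budget)
    return w_faith * expl.faithfulness - overload

candidates = [
    Explanation("saliency map", faithfulness=0.6, complexity=0.3),
    Explanation("full decision rules", faithfulness=0.9, complexity=0.8),
]

for budget in (0.2, 0.9):  # low- vs. high-budget simulated users
    best = max(candidates, key=lambda e: simulated_preference(e, budget))
    print(f"budget={budget}: preferred explanation -> {best.name}")
```

Run over many explanation and task combinations, simulated users of this kind let a pipeline screen candidate explanations cheaply before a costly human study: in this toy example, the low-budget user prefers the simpler saliency map while the high-budget user prefers the more faithful but more complex rules.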
Subjects: Computer Science - Human-Computer Interaction; Computer Science - Machine Learning
Details
Database: arXiv
Publication Type: Report
Accession number: edsarx.2406.00116
Document Type: Working Paper