1. iSee: Advancing Multi-Shot Explainable AI Using Case-based Recommendations
- Author
Wijekoon, Anjana, Wiratunga, Nirmalie, Corsar, David, Martin, Kyle, Nkisi-Orji, Ikechukwu, Palihawadana, Chamath, Caro-Martínez, Marta, Díaz-Agudo, Belen, Bridge, Derek, and Liret, Anne
- Subjects
Computer Science - Artificial Intelligence, Computer Science - Human-Computer Interaction, Computer Science - Information Retrieval
- Abstract
Explainable AI (XAI) can greatly enhance user trust and satisfaction in AI-assisted decision-making processes. Recent findings suggest that a single explainer may not meet the diverse needs of multiple users in an AI system; indeed, even individual users may require multiple explanations. This highlights the necessity for a "multi-shot" approach, employing a combination of explainers to form what we introduce as an "explanation strategy". An "explanation experience" describes a specific user's or user group's interactions with personalised strategies designed to enhance their AI decision-making processes. The iSee platform is designed for the intelligent sharing and reuse of explanation experiences, using Case-based Reasoning to advance best practices in XAI. The platform provides tools that enable AI system designers, i.e. design users, to design and iteratively revise the most suitable explanation strategy for their AI system to satisfy end-user needs. All knowledge generated within the iSee platform is formalised by the iSee ontology for interoperability. We use a summative mixed methods study protocol to evaluate the usability and utility of the iSee platform with six design users across varying levels of AI and XAI expertise. Our findings confirm that the iSee platform generalises effectively across applications and has the potential to promote the adoption of XAI best practices.
- Comment
Accepted to appear at the ECAI-PAIS 2024 main conference proceedings
- Published
2024
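
The abstract frames iSee as the retrieval and reuse of past explanation experiences via Case-based Reasoning. Purely as an illustration of that retrieve-and-reuse idea, the sketch below retrieves the most similar past case and reuses its multi-shot explanation strategy; all class names, attributes, and weights (ExplanationCase, ai_task, user_expertise, intents) are hypothetical and do not reproduce the iSee platform's data model or the iSee ontology.

```python
# Minimal, illustrative sketch of case-based retrieval of explanation strategies.
# All names and weights here are hypothetical, not the iSee platform's actual design.
from dataclasses import dataclass, field


@dataclass
class ExplanationCase:
    """A past explanation experience: a problem description plus the strategy that worked."""
    ai_task: str                                   # e.g. "loan-approval", "image-classification"
    user_expertise: str                            # e.g. "novice", "domain-expert"
    intents: frozenset                             # explanation needs, e.g. {"why", "counterfactual"}
    strategy: list = field(default_factory=list)   # ordered explainers forming the multi-shot strategy


def similarity(query: ExplanationCase, case: ExplanationCase) -> float:
    """Toy global similarity: weighted match of task, expertise, and intent overlap."""
    task_sim = 1.0 if query.ai_task == case.ai_task else 0.0
    expertise_sim = 1.0 if query.user_expertise == case.user_expertise else 0.0
    intent_sim = (len(query.intents & case.intents) / len(query.intents | case.intents)
                  if (query.intents or case.intents) else 1.0)
    return 0.4 * task_sim + 0.2 * expertise_sim + 0.4 * intent_sim


def retrieve(query: ExplanationCase, case_base: list, k: int = 1) -> list:
    """Retrieve the k most similar past experiences; their strategies are candidates for reuse."""
    return sorted(case_base, key=lambda c: similarity(query, c), reverse=True)[:k]


# Example: a designer of a new loan-approval system reuses the closest case's strategy.
case_base = [
    ExplanationCase("loan-approval", "novice", frozenset({"why", "counterfactual"}),
                    ["feature-importance", "counterfactual-example"]),
    ExplanationCase("image-classification", "domain-expert", frozenset({"why"}),
                    ["saliency-map"]),
]
query = ExplanationCase("loan-approval", "novice", frozenset({"why"}))
best = retrieve(query, case_base, k=1)[0]
print(best.strategy)  # -> ['feature-importance', 'counterfactual-example']
```

In the full iSee workflow described in the abstract, such a retrieved strategy would then be iteratively revised by the design user and the resulting experience retained for future reuse.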