
Prediction of Students' Adaptability Using Explainable AI in Educational Machine Learning Models.

Authors :
Nnadi, Leonard Chukwualuka
Watanobe, Yutaka
Rahman, Md. Mostafizer
John-Otumu, Adetokunbo Macgregor
Source :
Applied Sciences (2076-3417); Jun 2024, Vol. 14 Issue 12, p5141, 26p
Publication Year :
2024

Abstract

As the educational landscape evolves, understanding and fostering student adaptability has become increasingly critical. This study presents a comparative analysis of XAI techniques to interpret machine learning models aimed at classifying student adaptability levels. Leveraging a robust dataset of 1205 instances, we employed several machine learning algorithms, with a particular focus on Random Forest, which demonstrated the highest accuracy at 91%. The models' precision, recall, and F1-score were also evaluated, with Random Forest achieving a precision of 0.93, a recall of 0.94, and an F1-score of 0.94. Our study utilizes SHAP, LIME, Anchors, ALE, and Counterfactual explanations to reveal the specific contributions of various features impacting adaptability predictions. SHAP values highlighted the significance of 'Class Duration' (mean SHAP value: 0.175); LIME explained the intricate influence of socio-economic and institutional factors. Anchors provided high-confidence rule-based explanations (confidence: 97.32%), emphasizing demographic characteristics. ALE analysis underscored the importance of 'Financial Condition' with a positive slope, while Counterfactual scenarios highlighted the impact of slight feature variations, such as a 0.5 change in 'Class Duration'. Consistently, 'Class Duration' and 'Financial Condition' emerge as key factors, while the study also underscores the subtle effects of 'Institution Type' and 'Load-shedding'. This multi-faceted interpretability approach bridges the gap between machine learning performance and educational relevance, presenting a model that not only predicts but also explains the dynamic factors influencing student adaptability. The synthesized insights advocate for educational policies that accommodate socioeconomic factors, instructional time, and infrastructure stability to enhance student adaptability. The implications extend to informed and personalized educational interventions, fostering an adaptable learning environment. This methodical research contributes to responsible AI application in education, promoting predictive and interpretable models for equitable and effective educational strategies. [ABSTRACT FROM AUTHOR]
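For readers who want a sense of the workflow the abstract describes, the sketch below shows how a Random Forest classifier can be trained on a tabular adaptability dataset and then explained with SHAP. It is a minimal illustration, not the authors' code: the file name, column names ('Adaptivity Level', the placeholder features), and preprocessing are assumptions chosen for brevity.

```python
# Illustrative sketch only, not the paper's implementation.
# Assumes a CSV of the adaptability dataset (1205 instances) with categorical
# features such as 'Class Duration' and 'Financial Condition' and a target
# column named 'Adaptivity Level'.
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

df = pd.read_csv("students_adaptability.csv")  # hypothetical file name

# Simple integer encoding of categorical columns for this sketch.
X = df.drop(columns=["Adaptivity Level"]).apply(lambda col: pd.factorize(col)[0])
y = df["Adaptivity Level"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Random Forest, the best-performing model reported in the study.
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Accuracy, precision, recall, and F1 per adaptability level.
print(classification_report(y_test, model.predict(X_test)))

# SHAP values to rank feature contributions (e.g. 'Class Duration').
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
shap.summary_plot(shap_values, X_test, plot_type="bar")
```

The same fitted model could then be passed to LIME, Anchors, ALE, or counterfactual explainers to reproduce the kind of multi-method comparison the paper reports; the hyperparameters and encoding here are illustrative defaults, not those used by the authors.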

Details

Language :
English
ISSN :
20763417
Volume :
14
Issue :
12
Database :
Complementary Index
Journal :
Applied Sciences (2076-3417)
Publication Type :
Academic Journal
Accession number :
178158139
Full Text :
https://doi.org/10.3390/app14125141