GELT: A graph embeddings based lite-transformer for knowledge tracing.
- Source :
- PLoS ONE; 5/7/2024, Vol. 19 Issue 5, p1-22, 22p
- Publication Year :
- 2024
-
Abstract
- The development of intelligent education has led to the emergence of knowledge tracing as a fundamental task in the learning process. Traditionally, the knowledge state of each student has been determined by assessing their performance in previous learning activities. In recent years, Deep Learning approaches have shown promising results in capturing complex representations of human learning activities. However, the interpretability of these models is often compromised by the end-to-end training strategy they employ. To address this challenge, we draw inspiration from advancements in graph neural networks and propose a novel model called GELT (Graph Embeddings based Lite-Transformer), which uncovers and models the relationships between skills and questions. Additionally, we introduce an energy-saving attention mechanism for predicting knowledge states that is both simple and effective. This approach maintains high prediction accuracy while significantly reducing computational costs compared to conventional attention mechanisms. Extensive experimental results demonstrate the superior performance of our proposed model compared to other state-of-the-art baselines on three publicly available real-world knowledge tracing datasets.
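- Note: the abstract only describes the "energy-saving" attention at a high level. As a rough illustration of the general idea (replacing the quadratic cost of standard softmax attention with a kernelized, linear-complexity formulation), a minimal PyTorch sketch is given below. All class and variable names are hypothetical assumptions, not the authors' implementation; a real knowledge-tracing model would additionally apply a causal mask over the student's interaction sequence.

      # Minimal sketch (not the GELT code): a generic linear-attention layer that
      # avoids the O(n^2) cost of softmax attention, illustrating the kind of
      # computation-saving mechanism the abstract describes.
      import torch
      import torch.nn as nn

      class EfficientAttention(nn.Module):
          def __init__(self, dim: int, heads: int = 4):
              super().__init__()
              self.heads = heads
              self.to_qkv = nn.Linear(dim, dim * 3, bias=False)
              self.out = nn.Linear(dim, dim)

          def forward(self, x: torch.Tensor) -> torch.Tensor:
              # x: (batch, seq_len, dim), e.g. embeddings of a learner's past interactions
              b, n, d = x.shape
              h = self.heads
              q, k, v = self.to_qkv(x).chunk(3, dim=-1)
              q = q.view(b, n, h, d // h).transpose(1, 2)  # (b, h, n, d_h)
              k = k.view(b, n, h, d // h).transpose(1, 2)
              v = v.view(b, n, h, d // h).transpose(1, 2)
              # Positive kernel feature map; associativity of matrix products lets us
              # compute phi(Q) (phi(K)^T V) in O(n * d_h^2) rather than O(n^2 * d_h).
              q = torch.nn.functional.elu(q) + 1
              k = torch.nn.functional.elu(k) + 1
              kv = torch.einsum("bhnd,bhne->bhde", k, v)                # (b, h, d_h, d_h)
              z = 1.0 / (torch.einsum("bhnd,bhd->bhn", q, k.sum(dim=2)) + 1e-6)
              out = torch.einsum("bhnd,bhde,bhn->bhne", q, kv, z)       # (b, h, n, d_h)
              out = out.transpose(1, 2).reshape(b, n, d)
              return self.out(out)

      # Usage sketch: layer = EfficientAttention(dim=64, heads=4)
      #               y = layer(torch.randn(2, 100, 64))   # -> (2, 100, 64)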
Details
- Language :
- English
- ISSN :
- 1932-6203
- Volume :
- 19
- Issue :
- 5
- Database :
- Complementary Index
- Journal :
- PLoS ONE
- Publication Type :
- Academic Journal
- Accession number :
- 177089589
- Full Text :
- https://doi.org/10.1371/journal.pone.0301714