SPAKT: A Self-Supervised Pre-TrAining Method for Knowledge Tracing
- Authors
Yuling Ma, Peng Han, Huiyan Qiao, Chaoran Cui, Yilong Yin, and Dehu Yu
- Subjects
Knowledge tracing, student performance prediction, self-supervised learning, bidirectional encoder representation from transformers (BERT), Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
Knowledge tracing (KT) is the core task of computer-aided education systems: it aims to predict whether a student can answer the next exercise (i.e., question) correctly based on his/her historical answer records. In recent years, deep neural network-based approaches have been widely developed for KT and have achieved promising results. More recently, several studies have further boosted these KT models by exploiting rich relational information, including exercise-skill relations (E-S), exercise similarity (E-E), and skill similarity (S-S). However, such relational information is frequently absent in real-world educational applications, and labeling it is labor-intensive for human experts. Inspired by recent advances in the natural language processing domain, we propose a novel pre-training approach, named SPAKT, that uses self-supervised learning to pre-train exercise embedding representations without the need for expensive human-expert annotations. In contrast to existing pre-training methods that rely heavily on manually labeled knowledge about the E-E, S-S, or E-S relationships, the core idea of SPAKT is to design three self-attention modules that model the E-S, E-E, and S-S relationships, respectively, all of which can be trained in a self-supervised setting. As a pre-training approach, SPAKT can be effortlessly incorporated into existing deep neural network-based KT frameworks. We show experimentally that, even without expensive annotations for the three kinds of relationships above, our model achieves competitive performance compared with state-of-the-art methods. Our implementation is publicly available at https://github.com/Vinci-hp/pretrainKT.
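The mechanism the abstract describes (relationship-specific self-attention trained with a self-supervised objective over answer sequences) can be sketched compactly. Below is a minimal PyTorch sketch of one such module for the E-E relationship, using a BERT-style masked-exercise prediction objective as the self-supervised signal. All class names, hyperparameters, and the masking scheme here are illustrative assumptions, not the paper's exact design; the official implementation is in the linked repository.

```python
# Minimal sketch (assumptions, not the authors' exact architecture) of one
# self-attention module trained self-supervisedly, as the abstract describes.
import torch
import torch.nn as nn

class RelationSelfAttention(nn.Module):
    """Self-attention over a student's exercise sequence: each exercise
    attends to every other one, so co-occurrence patterns can stand in
    for hand-labeled E-E relations."""
    def __init__(self, num_exercises: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Embedding(num_exercises + 1, d_model)  # +1 for [MASK]
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        self.head = nn.Linear(d_model, num_exercises)  # logits over exercise ids

    def forward(self, exercise_ids: torch.Tensor) -> torch.Tensor:
        x = self.embed(exercise_ids)          # (batch, seq, d_model)
        out, _ = self.attn(x, x, x)
        return self.head(self.norm(x + out))  # (batch, seq, num_exercises)

def masked_pretrain_step(model, exercise_ids, mask_id, mask_prob=0.15):
    """BERT-style step: randomly mask exercise ids and train the module to
    recover them from context alone, with no expert annotations."""
    targets = exercise_ids.clone()
    mask = torch.rand_like(exercise_ids, dtype=torch.float) < mask_prob
    corrupted = exercise_ids.masked_fill(mask, mask_id)
    logits = model(corrupted)
    if not mask.any():            # edge case: nothing was masked this step
        return logits.sum() * 0.0
    return nn.functional.cross_entropy(logits[mask], targets[mask])

num_exercises = 100
model = RelationSelfAttention(num_exercises)
batch = torch.randint(0, num_exercises, (8, 20))  # toy answer sequences
loss = masked_pretrain_step(model, batch, mask_id=num_exercises)
loss.backward()
# After pre-training, the learned embedding table (model.embed) could seed
# the exercise encoder of a downstream KT model, consistent with the
# abstract's claim that SPAKT plugs into existing KT frameworks.
```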
- Published
2022