
Classifying Math KCs via Task-Adaptive Pre-Trained BERT

Authors:
Shen, Jia Tracy
Yamashita, Michiharu
Prihar, Ethan
Heffernan, Neil
Wu, Xintao
McGrew, Sean
Lee, Dongwon
Publication Year:
2021

Abstract

Educational content labeled with proper knowledge components (KCs) is particularly useful to teachers and content organizers. However, manually labeling educational content is labor-intensive and error-prone. To address this challenge, prior research proposed machine learning-based solutions to auto-label educational content, with limited success. In this work, we significantly improve on prior research by (1) expanding the input types to include KC descriptions, instructional video titles, and problem descriptions (i.e., three types of prediction task), (2) roughly doubling the granularity of the prediction from 198 to 385 KC labels (i.e., a more practical setting but a much harder multinomial classification problem), (3) improving prediction accuracies by 0.5-2.3% using Task-adaptive Pre-trained BERT, outperforming six baselines, and (4) proposing a simple evaluation measure by which we can recover 56-73% of mispredicted KC labels. All code and data sets used in the experiments are available at: https://github.com/tbs17/TAPT-BERT
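The abstract does not spell out the "simple evaluation measure" for recovering mispredicted KC labels; one plausible reading is a top-k style check: among examples whose top-1 prediction is wrong, count how often the true KC still appears in the model's k highest-scoring labels. The sketch below is an illustrative assumption, not the paper's confirmed method; the function name `top_k_recovery` and the toy scores are hypothetical.

```python
def top_k_recovery(scores, true_labels, k=3):
    """Among examples whose top-1 prediction is wrong, return the
    fraction whose true label still appears in the top-k predictions.
    (Illustrative guess at a 'misprediction recovery' measure;
    not taken from the paper itself.)"""
    recovered = []
    for row, true in zip(scores, true_labels):
        # rank label indices by score, highest first
        ranked = sorted(range(len(row)), key=lambda i: row[i], reverse=True)
        if ranked[0] == true:
            continue  # top-1 correct: not a misprediction
        recovered.append(true in ranked[:k])
    return sum(recovered) / len(recovered) if recovered else 0.0

# toy example: 4 problems, 5 hypothetical KC labels
scores = [
    [0.10, 0.70, 0.10, 0.05, 0.05],  # top-1 = 1, true = 1 (correct)
    [0.40, 0.30, 0.20, 0.05, 0.05],  # top-1 = 0, true = 1 (in top-3)
    [0.50, 0.10, 0.10, 0.10, 0.20],  # top-1 = 0, true = 4 (in top-3)
    [0.60, 0.15, 0.12, 0.03, 0.10],  # top-1 = 0, true = 3 (not in top-3)
]
true_labels = [1, 1, 4, 3]
recovery = top_k_recovery(scores, true_labels, k=3)
```

Of the three mispredicted examples, two have the true label within the top three scores, so `recovery` is 2/3 here.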

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2105.11343
Document Type:
Working Paper