PQSCT: Pseudo-Siamese BERT for Concept Tagging with Both Questions and Solutions
- Source :
- IEEE Transactions on Learning Technologies. 2023 16(5):831-846.
- Publication Year :
- 2023
Abstract
- The global outbreak of the COVID-19 pandemic has accelerated the development of intelligent education and the adoption of online learning systems. To provide students with intelligent services, such as cognitive diagnosis and personalized exercise recommendation, a fundamental task is concept tagging for exercises, which extracts knowledge index structures and knowledge representations for exercises. Unfortunately, to the best of our knowledge, existing tagging approaches based on exercise content either ignore the multiple components of exercises or ignore that exercises may contain multiple concepts. To this end, in this article, we present a study of concept tagging. First, we propose an improved pretrained bidirectional encoder representations from transformers (BERT) model for concept tagging with both questions and solutions (QSCT). Specifically, we design a question-solution prediction task and apply the BERT encoder to combine questions and solutions, ultimately obtaining the final exercise representation through feature augmentation. Then, to further explore the relationship between questions and solutions, we extend QSCT to a pseudo-Siamese BERT for concept tagging with both questions and solutions (PQSCT). We optimize the feature fusion strategy, which integrates five different vector features, both local and global, into the final exercise representation. Finally, we conduct extensive experiments on real-world datasets, which clearly demonstrate the effectiveness of our proposed models for concept tagging.
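The pseudo-Siamese design described in the abstract uses two encoders with the same architecture but separate weights, one for questions and one for solutions, and then fuses several vector features into a single exercise representation. The toy sketch below illustrates only this fusion idea with NumPy stand-ins for the BERT encoders; the exact five features are not specified in the abstract, so the set used here (`u`, `v`, `|u - v|`, `u * v`, and a joint "global" vector) is an assumed, commonly used choice, not the authors' definitive formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # toy embedding size; a real BERT encoder would use 768

# Pseudo-Siamese: identical architecture, SEPARATE weights.
# Each "encoder" here is a random linear map plus tanh -- a
# stand-in for a full BERT encoder, for illustration only.
W_question = rng.standard_normal((DIM, DIM))
W_solution = rng.standard_normal((DIM, DIM))

def encode(x, W):
    """Toy encoder: linear projection + tanh nonlinearity."""
    return np.tanh(W @ x)

def fuse(u, v, g):
    """Concatenate five vector features into one representation.

    Assumed feature set: local vectors u and v, their element-wise
    difference and product, and a joint global vector g.
    """
    return np.concatenate([u, v, np.abs(u - v), u * v, g])

question = rng.standard_normal(DIM)  # toy question features
solution = rng.standard_normal(DIM)  # toy solution features

u = encode(question, W_question)               # local question vector
v = encode(solution, W_solution)               # local solution vector
g = encode(question + solution, W_question)    # crude joint "global" vector

exercise_repr = fuse(u, v, g)
print(exercise_repr.shape)  # (40,) = 5 features x DIM
```

The resulting vector would then feed a multi-label classifier, since the paper stresses that one exercise may carry multiple concepts.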
Details
- Language :
- English
- ISSN :
- 1939-1382
- Volume :
- 16
- Issue :
- 5
- Database :
- ERIC
- Journal :
- IEEE Transactions on Learning Technologies
- Publication Type :
- Academic Journal
- Accession number :
- EJ1396544
- Document Type :
- Journal Articles; Reports - Research
- Full Text :
- https://doi.org/10.1109/TLT.2023.3275707