Less but Enough: Evaluation of Peer Reviews through Pseudo-Labeling with Less Annotated Data

Authors :
Liu, Chengyuan
Doshi, Divyang
Shang, Ruixuan
Cui, Jialin
Jia, Qinjin
Gehringer, Edward
Source :
Journal of Educational Data Mining, 2023, 15(2): 123-140.
Publication Year :
2023

Abstract

A peer-assessment system provides a structured learning process for students and allows them to write textual feedback on each other's assignments and projects. This helps instructors or teaching assistants perform a more comprehensive evaluation of students' work. However, the contribution of peer assessment to students' learning relies heavily on the quality of the reviews. Therefore, a thorough evaluation of the quality of peer assessment is essential to ensuring that the process benefits students' learning. Previous studies have focused on applying machine learning to evaluate peer assessment by identifying characteristics of reviews (e.g., do they mention a problem, make a suggestion, or tell the students where to make a change?). Unfortunately, collecting ground-truth labels for these characteristics is an arbitrary, subjective, and labor-intensive task. Besides, in most cases those labels are assigned by students, not all of whom are reliable as a source of labeling. In this study, we propose a semi-supervised pseudo-labeling approach to build a robust peer-assessment evaluation system that utilizes large unlabeled datasets along with only a small amount of labeled data. We aim to evaluate peer assessment from two angles: detecting a problem statement (does the reviewer mention a problem with the work?) and detecting a suggestion (does the reviewer give a suggestion to the author?).
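
For readers unfamiliar with pseudo-labeling, the sketch below illustrates the general technique in Python. It is not the authors' pipeline: the classifier (logistic regression over TF-IDF features), the 0.9 confidence threshold, and the toy review texts are all illustrative assumptions.

```python
# A minimal, generic pseudo-labeling sketch (NOT the paper's exact method):
# 1) train a classifier on a small labeled set,
# 2) predict on the unlabeled pool,
# 3) keep only high-confidence predictions as pseudo-labels,
# 4) retrain on the combined data.
import numpy as np
from scipy.sparse import vstack
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy data (illustrative only): 1 = review mentions a problem, 0 = it does not.
labeled_texts = ["The method section is unclear.", "Great structure overall."]
labels = [1, 0]
unlabeled_texts = ["Figure 2 is hard to read.", "Nice work!"]

# Train an initial model on the small labeled set.
vec = TfidfVectorizer()
X_labeled = vec.fit_transform(labeled_texts)
clf = LogisticRegression().fit(X_labeled, labels)

# Predict on the unlabeled reviews and keep only confident predictions.
X_unlabeled = vec.transform(unlabeled_texts)
proba = clf.predict_proba(X_unlabeled)
confident = proba.max(axis=1) >= 0.9  # confidence threshold (assumed)
pseudo_labels = proba.argmax(axis=1)[confident]

# Retrain on the union of labeled and pseudo-labeled examples.
X_aug = vstack([X_labeled, X_unlabeled[confident]])
y_aug = np.concatenate([labels, pseudo_labels])
clf_final = LogisticRegression().fit(X_aug, y_aug)
```

In practice this loop is often repeated, with the threshold controlling the trade-off between how much unlabeled data is absorbed and how much label noise is introduced.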

Details

Language :
English
ISSN :
2157-2100
Volume :
15
Issue :
2
Database :
ERIC
Journal :
Journal of Educational Data Mining
Publication Type :
Academic Journal
Accession number :
EJ1396231
Document Type :
Journal Articles; Reports - Research