
Unique and Randomized Quiz Generation for Enhanced Learning.

Authors :
Burns, Mark A.
Johnson, Valerie N.
Smith, Kaylee
Source :
Proceedings of the ASEE Annual Conference & Exposition. 2022, p1-11. 11p.
Publication Year :
2022

Abstract

Assessment of student learning is difficult in even the best of times. During the pandemic, when most classes pivoted to remote instruction in a span of days, administering assessments such as quizzes and exams became even more complicated. Answer sharing and web searches, which are relatively easy to control during an in-person exam, are next to impossible to monitor in a remote setting. Even with exams in a physical classroom, almost all exams use a single version, so some students may be tempted to exchange numbers related to specific problems. Finally, in both exams and homework problem sets, the answers are typically provided several days to weeks after the assessment is submitted, severely limiting any iterative learning process. To address these issues, we developed a set of Excel files that provide an individualized assessment experience for each student. Data for each question were loaded into various Excel sheets, and multiple questions could be combined to produce a quiz or exam. Four question formats were supported: True/False, Multiple Choice, Fill in the Blank, and Calculation. The Excel file used the input information to produce hundreds of unique question variations, thus providing a unique (i.e., different question and different answer) assessment for each student. A text file of these questions was then converted to a QTI .zip file using the Python tool text2qti (available on GitHub) for loading into the Quizzes page in Canvas. The advantages of combining the Excel files with Canvas were significant, enabling the time-effective generation of thousands of unique Canvas quizzes and offering immediate feedback to students. The unique quizzes allowed students to discuss the problems without sharing answers and enabled multiple attempts on different quiz versions of each assessment. [ABSTRACT FROM AUTHOR]
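The pipeline the abstract describes (seeded variation of question parameters, emitted as a plain-text quiz for text2qti to convert into a Canvas-importable QTI .zip) can be sketched in a few lines of Python. This is a minimal illustration, not the authors' Excel implementation: the question, parameter ranges, and function names are hypothetical, and the output mimics text2qti's plain-text quiz format (numbered questions; `=` lines with a `+-` tolerance for numerical answers).

```python
import random

def make_calculation_question(seed):
    """Generate one unique variant of a Calculation-format question.

    Hypothetical example: each student's seed yields different numbers
    and therefore a different correct answer, analogous to the paper's
    per-student Excel-generated variants.
    """
    rng = random.Random(seed)        # deterministic per student
    mass = rng.randint(2, 20)        # kg
    accel = rng.randint(1, 10)       # m/s^2
    answer = mass * accel            # F = m * a
    return (
        f"1.  A {mass} kg object accelerates at {accel} m/s^2. "
        f"What net force (in N) acts on it?\n"
        f"=   {answer} +- 0.5\n"
    )

def build_quiz(student_id):
    """Assemble a text2qti-style quiz text for one student."""
    header = "Quiz title: Dynamics Quiz (unique per student)\n\n"
    return header + make_calculation_question(student_id)

print(build_quiz(42))
```

Each resulting text file would then be run through text2qti on the command line (e.g. `text2qti quiz.txt`) to produce the QTI .zip for Canvas import; the seeding makes every student's quiz reproducible yet distinct.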

Details

Language :
English
ISSN :
2153-5868
Database :
Academic Search Index
Journal :
Proceedings of the ASEE Annual Conference & Exposition
Publication Type :
Conference
Accession number :
172835063