
Assessing Psychology Student Applied Knowledge of Statistics via Open-Book Multiple Choice Online Exams

Authors:
Sarven Savia McLinton
Sharon Elizabeth Wells
Source:
Designs for Learning, 2023, 15(1), 58-69.
Publication Year:
2023

Abstract

Real-world applications of statistics are rarely 'off the top of your head'; however, statistics and research methods courses default to closed-book exams that test only rote learning. Emerging research supports open-book exams that test the application of student knowledge rather than memory; however, statistics courses in psychology are lagging amidst fears of cheating in online open-book multiple-choice exams. The aim of this study was twofold: first, to develop an online open-book multiple-choice exam that tests the application of psychology statistics and research methods knowledge, and second, to demonstrate that it is just as reliable a source of final grades as traditional closed-book exams. We compared results from a new Applied Exam (N = 104 undergraduate third-year psychology statistics students) with the previous year's Traditional Exam (N = 81), correlating each with Research Report grades (the best course-assessment indicator of real-world performance). Similarly strong positive correlations were observed between the written assessments and the Traditional Exam (0.59**) or the Applied Exam (0.54**), and both exams displayed comparable bell curves for grade differentiation, suggesting we can depend on the new Applied Exam for final course grade data. The Applied Exam also aligns better with course objectives and graduate qualities for effective problem solving in novel situations. Automated assessment of applied knowledge benefits psychology instructors and organisations by reducing administration, and benefits psychology students by alleviating the anxiety associated with closed-book invigilated exams. Together, this presents an opportunity to improve student outcomes by encouraging the development of real-world skills, preparing students for competitive job markets that value critical thinking.
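
The correlational comparison described in the abstract can be reproduced in outline with standard tools. The sketch below is illustrative only: the study's data are not included in this record, so the grade vectors are hypothetical placeholders, and SciPy's pearsonr is simply one common way to compute the Pearson correlations reported (0.59 and 0.54).

```python
# Illustrative sketch only: the paper's data are not public, so the grade
# vectors below are hypothetical placeholders, not the study's data.
import numpy as np
from scipy import stats

# Hypothetical per-student grades (0-100) for each cohort
applied_exam = np.array([72, 65, 81, 58, 90, 77, 63, 85])
report_applied = np.array([70, 60, 78, 55, 88, 80, 61, 83])

traditional_exam = np.array([68, 74, 55, 82, 61, 79, 70, 88])
report_traditional = np.array([65, 70, 52, 85, 60, 75, 72, 84])

# Pearson correlation between each exam format and the Research Report grade,
# mirroring the comparison summarised in the abstract
r_applied, p_applied = stats.pearsonr(applied_exam, report_applied)
r_traditional, p_traditional = stats.pearsonr(traditional_exam, report_traditional)

print(f"Applied Exam vs Report:     r = {r_applied:.2f}, p = {p_applied:.3f}")
print(f"Traditional Exam vs Report: r = {r_traditional:.2f}, p = {p_traditional:.3f}")
```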

Details

ISSN:
2001-7480
Volume:
15
Issue:
1
Database:
ERIC
Journal:
Designs for Learning
Publication Type:
Academic Journal
Accession Number:
EJ1407104
Document Type:
Journal Articles; Reports - Research