1. Measuring More than Content: The Development and Testing of a Diagnostic Instrument Designed to Assess Student Physics Mindware and Reasoning Consistency
- Author
- Santangelo, Brianna
- Abstract
Over the past few decades, researchers in physics education research (PER) have developed a variety of assessment instruments for research and instructional purposes. Most of these instruments are designed to measure topical understanding. Generally, if student performance is satisfactory, it is assumed that the students possess the intended content understanding and that the instruction was successful. If performance is less satisfactory, it is assumed that student understanding is weak and that significant changes in instruction are required. The challenge with this assessment approach is that the term "student understanding" is not well defined. Moreover, the factors that contribute to improved student understanding have not been examined explicitly and systematically. Recently, PER researchers have begun integrating findings from cognitive psychology to explore the impact of specific factors on student performance in physics, including the strength of content knowledge (or mindware), reasoning skills, and tendencies toward cognitive reflection. Traditional assessments do not differentiate among these (or other) factors. This thesis describes work that uses Dual Process Theories of Reasoning (DPToR) as a theoretical framework to investigate patterns in student performance in physics, to disentangle the various factors that impact performance, and to develop a diagnostic instrument designed to measure changes in content knowledge and in the reasoning needed to apply that knowledge correctly. Specifically, we examined the degree to which student reasoning patterns are consistent with the reasoning pathways predicted by DPToR. Next, we applied a screening-target methodology to disentangle students' content knowledge (i.e., mindware) from their reasoning approaches on a sequence of 11 screening-target pairs of physics tasks. We then explored the relationships among mindware, reasoning, and tendencies toward cognitive reflection. We found that most item pairs functioned as expected: performance on a screening item predicts performance on the corresponding target item, and performance on the Cognitive Reflection Test (CRT) is linked to performance on target items under certain conditions. Finally, we explored the use of hierarchical cluster analysis to categorize changes in item profiles that measure content knowledge and the consistency of reasoning with that knowledge. We found that the instrument could differentiate students along these axes, but not into the groups expected by DPToR.
- Published
- 2023