Validating Instructor-Created Multiple-Choice Statics Exams Using the Concept Assessment Tool for Statics
Many instructors use multiple-choice exams for classroom assessment because they are easy to administer, grade, and return for timely feedback. However, there is ongoing debate about how well multiple-choice exams assess student knowledge beyond simple recall of facts. This study investigates the validity of multiple-choice class exams created by the instructor of an engineering statics course. To determine whether the exams tested conceptual knowledge at a high level, students’ scores on the class exams were compared with their scores on the Concept Assessment Tool for Statics (CATS), a reliable, validated measure of conceptual knowledge in statics. Scores on the class exams were found to be positively correlated with scores on the CATS, indicating that the exams were testing the same level of knowledge. These results demonstrate one approach to validating instructor-created multiple-choice exams against an external measurement tool such as the CATS.
Green, Theresa and Wertz, Ruth E.H., "Validating Instructor-Created Multiple-Choice Statics Exams Using the Concept Assessment Tool for Statics" (2016). Fall Interdisciplinary Research Symposium. Paper 28.