Validating Instructor-Created Multiple-Choice Statics Exams Using the Concept Assessment Tool for Statics
Primary Submission Contact
Theresa Green
Faculty Sponsor
Ruth Wertz
Faculty Sponsor Email Address
ruth.wertz@valpo.edu
College
Engineering
Department/Program
Mechanical Engineering
Document Type
Poster Presentation
Date
Fall 10-28-2016
Abstract
Many instructors use multiple-choice exams for classroom assessment because they are easy to administer, grade, and return for timely feedback. However, there is debate about how well multiple-choice exams test student knowledge at a level beyond simple factual recall. This study investigates the validity of multiple-choice class exams created by the instructor of an engineering statics course. To determine whether the exams were testing a high level of conceptual knowledge, students’ scores on the class exams were compared to their scores on the Concept Assessment Tool for Statics (CATS), a reliable, validated measure of statics conceptual knowledge. Scores on the class exams were found to be positively correlated with scores on the CATS, suggesting that the exams were testing a similar level of conceptual knowledge. These results demonstrate an approach to validating instructor-created multiple-choice exams against an external measurement tool, such as the CATS.
Recommended Citation
Green, Theresa and Wertz, Ruth E.H., "Validating Instructor-Created Multiple-Choice Statics Exams Using the Concept Assessment Tool for Statics" (2016). Fall Interdisciplinary Research Symposium. 28.
https://scholar.valpo.edu/fires/28
Additional Presentation Information
Wall Poster