Validating Instructor-Created Multiple-Choice Statics Exams Using the Concept Assessment Tool for Statics

Primary Submission Contact

Theresa Green

Faculty Sponsor

Ruth Wertz

Faculty Sponsor Email Address

Mechanical Engineering

Document Type

Poster Presentation

Fall 10-28-2016


Abstract

Many instructors use multiple-choice exams for classroom assessment because they are easy to administer, grade, and return for timely feedback. However, there is debate about how well multiple-choice exams can test student knowledge beyond simple factual recall. This study investigates the validity of multiple-choice class exams created by the instructor of an engineering statics course. To determine whether the exams tested a high level of conceptual knowledge, students’ scores on the class exams were compared with their scores on the Concept Assessment Tool for Statics (CATS), a reliable, validated measure of conceptual knowledge in statics. Scores on the class exams were positively correlated with scores on the CATS, suggesting that the exams assessed a comparable level of conceptual knowledge. These results illustrate how instructor-created multiple-choice exams can be validated against an external measurement tool such as the CATS.

Additional Presentation Information

Wall Poster