Students' Reactions to Written Test Item Rebuttals


Because each item on a multiple choice test has a single keyed answer, such tests allow convenient machine scoring. One disadvantage, however, is that students have no opportunity to explain why they selected a distracter, even though their reasoning could demonstrate content knowledge. I have experimented with allowing students to argue test questions in writing. Reviewing students' remarks gives me the opportunity to understand their reasoning and to correct flawed test items.


Multiple choice tests are ubiquitous in college, especially in large introductory classes (Paxton, 2000). Tests composed of multiple choice items have a number of advantages: ease and objectivity in scoring, potential to cover a large amount of content, and a lower chance that students will guess the correct answer than on true-false (T-F) questions. Despite the benefits of multiple choice testing, the method has difficulties.

One of the potential problems of the multiple choice test is that although one option per item is keyed as correct, students may select a distracter for legitimate reasons. For example, students' comments during class discussion may have included one of the incorrect answers. Several authors (Dodd & Leal, 1988; Nield & Wintre, 1986) have examined the use of multiple choice tests, and have proposed similar approaches to the problem of "tricky" multiple choice questions.

Nield and Wintre experimented with an "E" option on four-option multiple choice tests. In this format, students could choose one of the four options and if desired, explain a marked answer by also marking the "E" option. First, instructors graded exams with the scoring key; then they reviewed the "E" options marked. Students received a point for a good explanation of a wrong answer, but lost a point for a bad explanation of a correctly marked answer. Dodd and Leal's "answer justification" technique works in a similar manner. While taking the exam, students write justifications for their answers to those questions they see as tricky. Dodd and Leal first grade the exam and then review the students' justifications for the questions they missed. Students responded favorably to the "E" and answer justification options, reported less anxiety in taking multiple choice exams, and considered the "E" option multiple choice exam the least frustrating type of exam to take when compared with true-false, short answer, and essay.

These two approaches yield the desired outcome of students demonstrating content knowledge, but they are time-consuming in a large section class, especially if one does not have a teaching assistant (Dodd & Leal, 1988, note their technique requires less scoring time than the Nield & Wintre, 1986, technique). The technique I describe should take less time than either the Dodd and Leal or Nield and Wintre procedures.

My method combines the Nield and Wintre and Dodd and Leal benefits of providing students an opportunity to explain their answers with the notion that tests can serve as learning events (Foley, 1981). Foley permitted students to take exams home to research answers to the questions they did not know and then retake the test (Toma & Heady, 1996, describe a similar technique). Foley's approach was popular with students, but it seems impractical to create new items for every new class. What I have done is encourage students to review the tests in my office, on library reserve, or during a specified class and prepare written rebuttals to questions in which they explain their rationale for selecting an incorrect option. I review these rebuttals to determine if students' responses justify awarding more points. I have used this method for several years and have collected responses on a brief survey to assess students' reactions.



Participants were 238 students enrolled in my classes over a three year period. …