Item Difficulty of Multiple Choice Tests Dependent on Different Item Response Formats - an Experiment in Fundamental Research on Psychological Assessment
Kubinger, Klaus D., Gottschall, Christian H., Psychology Science
Multiple choice response formats are problematic because an item is often scored as solved simply because the test-taker is a lucky guesser. Instead of applying pertinent IRT models that take guessing effects into account, a pragmatic approach of re-conceptualizing multiple choice response formats to reduce the chance of lucky guessing is considered. This paper compares the free response format with two different multiple choice formats: a common multiple choice format with a single correct response option and five distractors ("1 of 6"), and a multiple choice format with five response options, any number of which may be correct, where the item is scored as mastered only if all the correct response options and none of the wrong ones are marked ("x of 5"). An experiment was designed using pairs of items with exactly the same content but different response formats. 173 test-takers were randomly assigned to two test booklets comprising 150 items altogether. Rasch model analyses adduced a fitting item pool after the deletion of 39 items. The resulting item difficulty parameters were used to compare the different formats. The multiple choice format "1 of 6" differs significantly from "x of 5", with a relative effect of 1.63, while the multiple choice format "x of 5" does not differ significantly from the free response format. The lower difficulty of items with the "1 of 6" multiple choice format is therefore an indicator of relevant guessing effects. In contrast, the "x of 5" multiple choice format can be seen as an appropriate substitute for the free response format.
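The "x of 5" scoring rule described in the abstract - an item counts as mastered only when all correct options and none of the wrong ones are marked - can be sketched as follows. This is an illustrative sketch, not code from the study; the function and option names are hypothetical.

```python
def score_x_of_5(marked, correct):
    """Score an 'x of 5' item: solved only if the set of marked
    options equals the set of correct options exactly."""
    return set(marked) == set(correct)

# Hypothetical item with options A-E, where B and D are correct
correct = {"B", "D"}
print(score_x_of_5({"B", "D"}, correct))       # exact match: True
print(score_x_of_5({"B"}, correct))            # missing D: False
print(score_x_of_5({"B", "D", "E"}, correct))  # extra mark: False
```

The strictness of the rule is the point: a partially correct marking pattern earns no credit, which is what drives the format's low a priori guessing probability.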
Key words: multiple choice response format, guessing effect, item difficulty, Rasch model, psychometrics
Computer administration of psychological tests forces test authors to use items with multiple choice response formats instead of a free response format. For reasons of economy, multiple choice tests are usually preferred even when a free response format would do. This raises the question of whether the same or different measurement dimensions are being used - experiments on learning have shown that examination of learned material becomes much easier if mere recognition rather than reproduction of the material is required - but this point will be neglected in the following. The problem with which we deal in this paper is the guessing phenomenon, which arises whenever a multiple choice response format is used for the items of a test.
In other words, a multiple choice response format is problematic because items are often scored as solved even though a test-taker has a poor level of the ability that is intended to be measured. Empirical experience demonstrates that a prototypical test-taker chooses any one of the response options (i.e. suggested solutions which are offered for choice) by chance if he/she does not know the correct answer - given that refusing to respond does not seem a fair option. Hence, even for a test-taker with an ability level of zero or minus infinity, respectively, there is always a larger than zero solution probability for every item. For this probability the term "a priori guessing probability" is used; it is conventionally 1/k, with k being the number of response options. Very often this probability equates to 1/5, as there is only one correct option among five given response options, the other four options being so-called 'distractors'. The guessing problem worsens because test-takers with a moderate ability are immediately able to rule out certain distractors from serious consideration, and consequently the actual guessing probability of a certain test-taker is larger than 1/k, sometimes as high as 1/[k - (k - 2)] = 1/2.
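The a priori guessing probabilities underlying this argument can be made concrete. The sketch below assumes a blind guesser: for "1 of k" they pick one option uniformly at random, and for "x of k" they mark each option independently with probability 1/2, so only one of the 2^k marking patterns is the exactly correct one. These modelling assumptions are ours, not taken from the paper.

```python
from fractions import Fraction

def guess_prob_1_of_k(k):
    """'1 of k' format: one correct option picked at random out of k."""
    return Fraction(1, k)

def guess_prob_x_of_k(k):
    """'x of k' format: each of the k options marked independently
    with probability 1/2; only the exact correct pattern scores."""
    return Fraction(1, 2) ** k

print(guess_prob_1_of_k(6))  # 1/6 for the "1 of 6" format
print(guess_prob_x_of_k(5))  # 1/32 for the "x of 5" format

# A moderately able test-taker who rules out k-2 distractors in a
# "1 of k" item effectively faces 1/[k - (k-2)] = 1/2:
print(guess_prob_1_of_k(2))  # 1/2
```

Under these assumptions the "x of 5" format's a priori guessing probability (1/32) is far below that of "1 of 6" (1/6), which is consistent with the paper's finding that "x of 5" behaves like a free response format.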
Of course, such guessing effects diminish the reliability as well as the validity of a test. Even more seriously, they indicate unfair consulting. Bearing in mind that according to …
Publication information: Kubinger, Klaus D., & Gottschall, Christian H. (2007). Item Difficulty of Multiple Choice Tests Dependant on Different Item Response Formats - an Experiment in Fundamental Research on Psychological Assessment. Psychology Science, 49(4), 361+. Published October 1, 2007.