Does the Gender of Examiners Influence Their Marking?

Article excerpt

Three awarding bodies - Assessment and Qualifications Alliance (AQA), London Qualifications and Oxford, Cambridge and RSA Examinations (OCR) - administer the General Certificate of Secondary Education (GCSE) examination and A levels in England. GCSE and A level are national assessments normally taken in a series of subjects by sixteen-year-olds and eighteen-year-olds respectively.

Within the field of educational assessment there is a large literature on sex bias, for example Spear (1984) and Gipps and Murphy (1994). Studies have examined the sex bias of whole tests or of individual questions and/or their associated mark schemes. The concern is that particular groups, whether defined by gender, ethnicity or other characteristics, gain lower marks than other groups. Professional judgement is needed to determine when a disparity amounts to a bias. A major issue for many years was that men gained higher marks than women on the high-stakes Scholastic Aptitude Test (SAT), which is used to determine college admissions and scholarship awards in the United States (see Lynn and Mau, 2001). Items might be biased because of the emphasis placed upon particular skills or the context in which a problem is set. For example, Dwyer (1976, in Murphy, 1978) found that the mathematics SAT was biased towards males owing to the inclusion of more geometry than algebra problems. Graf and Ridell (1972) found that the same mathematical problem set in a male-friendly and in a female-friendly context resulted in different levels of achievement by the two sexes. Wood (1978) found that girls did better on GCE O-level examination questions about females or female-stereotyped contexts, e.g. a girl's ordeal at a dinner dance, while boys performed better on questions about the Crimean War and/or a man looking back on a boyhood spent near a railway line. Boaler (1994), however, found that it was the realism of the context, as well as the extent of the sex stereotyping in the examination question, that affected girls' performance. The literature given above tends to refer to individual questions, but Stobart et al. (1992) show evidence that in GCSE and O levels different types of assessment and different subjects differentially affect the achievement of the sexes.

Girls tend to outperform boys in all subjects except the sciences; girls tend to do well on coursework, whereas boys tend to do well on objective (multiple-choice) tests. Stobart et al. warn that 'Equal outcomes should not therefore be contrived by manipulation of assessment techniques' (Stobart et al., 1992, p. 261).

Other studies have focused upon another form of bias. Gipps (1994) explains that bias can occur when the score given by an examiner is consciously or unconsciously affected by factors other than the candidate's achievement, e.g. sex, ethnic origin, school or handwriting. In the context of UK higher education, Newstead and Dennis (1990) examined inter-rater reliability for blind and non-blind grading. They found no sex bias (i.e. no favouritism towards students of one sex) in the grading of undergraduate students. Baird (1996) investigated sex bias in the marking of Chemistry and English Literature A level examinations, using a blind marking approach, and found that marks were not affected by the sex of the examinee. In the case of 'live' GCE and GCSE examinations, however, blind marking would be a considerable logistical challenge.

Examiners can also award marks for answers that demonstrate skills, knowledge and/or values irrelevant to the test but valued by the markers. Even under blind marking, the sex of the examinee might be inferred from the candidate's handwriting (girls' handwriting is perceived to be neater) and stereotypes can come into play. The Scottish Examining Board (1992) investigated marker practices in English and History using scripts that varied in the achievement of the centre (school or college) and in the handwriting, gender and ethnic origin of the candidate. …
