Australasian Journal of Engineering Education

Peer Assessment of Past Exam Scripts

1 INTRODUCTION

There is a growing trend in education to use students and the work they produce as a learning resource (Topping, 1998). The area encompasses a wide range of initiatives, from peer-assisted study sessions (Miller et al, 2004) to student self-assessment (Read et al, 2004). The traditional approach to peer assessment involves one student marking the work of another. In many cases the work to be marked has a simple layout; for example, a spelling test or a list of multiple-choice answers. While it is debatable whether peer assessment at this level encourages deep learning, more sophisticated versions of peer assessment have been shown to assist in the development of critical thinking and to improve learning outcomes (Pinkerton, 2005). It may be argued that one of the fundamental aims of peer or self-assessment is that students learn from their own mistakes and those of others. However, it has been shown that students can struggle to diagnose errors in another's work and may need to be guided through the process (Yerushalmi & Polingher, 2006).

Examination papers have been used as a research resource to determine student misconceptions (Tang & Williams, 2000), and a strong relationship has been measured between a student's ability to develop a comprehensive model exam paper and their final mark in a course of study (Brink et al, 2004). English et al (2006) investigated whether peer assessment of in-course work had an effect on final exam performance. The in-course work was submitted as an MS Word document, and each participating student marked the work of only one peer. The peer's work was selected at random from the group and therefore did not necessarily highlight any common misconceptions. The results of their study were inconclusive, although the students who participated in the peer assessment exercise demonstrated a small improvement in exam performance. It was also noted that what students liked about the exercise was the insight it gave them into the marking process and into what examiners were looking for.

The project reported in this paper focused on peer assessment of past exam scripts (hand-written solutions taken from the answer booklets of past students), and investigated the use of past scripts as a learning resource in a first-year engineering unit, Engineering Dynamics (ED), at the University of Western Australia. Students were required to assess a number of past exam scripts, and to identify and explain any errors made. They were also required to evaluate the importance of the errors and to give each script a mark. The scripts were chosen to expose students to examples of good and bad problem solving. It was hoped that this would assist them to reflect more fully on their own work and to adopt a deeper learning approach by extending their understanding of the course material (Biggs, 1988). In Anderson & Krathwohl's (2001) revised Bloom's taxonomy of educational objectives for the cognitive domain, the levels of learning are classified as "remember, understand, apply, analyse, evaluate and create". Current assessment methods for ED encompass the first three levels with a small analysis component. By requiring students to interpret and evaluate the work of others, the project was designed to develop the fourth and fifth levels of learning. The exercise also addressed an important skill required of professional engineers: the ability to review another's work. A third aim of the project was to raise the students' awareness of important non-technical issues such as the legibility of a solution, explicit reasoning and logical layout.

An overview of the unit and how it is currently assessed is given in the next section. This is followed by a description of the assessment task, the results, some observations on the students' marking behaviour, and student feedback on the task. Based on this feedback, the exam scripts of a random sample of students who had completed the assessment task were reviewed. …
