Academic journal article
By Klecker, Beverly M.
Journal of Instructional Psychology , Vol. 34, No. 3
This exploratory study examined the impact of feedback from weekly multiple-choice tests on final exam scores and students' IDEA course ratings. The teacher-researcher taught two sections (N = 33; N = 34) of a graduate-level, semester-long, online course in advanced human growth and development. Each section had identical course materials on separate electronic Blackboard sites. The treatment, weekly 20-item multiple-choice tests, was randomly assigned to one section; the other section had no weekly multiple-choice tests. Identical final examinations and IDEA course evaluations were used to measure differences in learning and student course satisfaction. Students in the section with the weekly formative tests had statistically significantly (p < .05) higher final exam scores. There was no statistically significant difference in the students' course ratings.
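The comparison described above, two independent sections measured on an identical final examination, is a standard two-sample setting. As a minimal sketch, the following computes Welch's independent-samples t statistic for two score lists; the scores here are invented purely for illustration, since the article reports only the significance level, not raw data:

```python
from statistics import mean, variance

# Hypothetical final-exam scores for illustration only; the article
# does not report raw scores. section_a took weekly formative tests,
# section_b did not.
section_a = [88, 92, 85, 90, 87, 91, 86, 93, 89, 84]
section_b = [80, 84, 78, 83, 79, 85, 77, 82, 81, 76]

def welch_t(x, y):
    """Welch's independent-samples t statistic (unequal variances assumed)."""
    vx, vy = variance(x), variance(y)   # sample variances
    nx, ny = len(x), len(y)
    return (mean(x) - mean(y)) / ((vx / nx + vy / ny) ** 0.5)

t = welch_t(section_a, section_b)
# For samples of roughly this size, |t| > ~2.0 corresponds to p < .05
# (two-tailed), the same significance criterion used in the study.
print(round(t, 2))  # → 5.91
```

In practice a researcher would obtain the exact p-value from statistical software (e.g., `scipy.stats.ttest_ind` with `equal_var=False`) rather than comparing against an approximate critical value.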
The online classroom environment provides challenging, unique, and exciting opportunities for assessing student learning. Benson (2003) suggested:
Two key benefits of online assessments are (1) the ability of every learner to respond to every question the instructor asks and (2) the ability of the instructor to provide immediate feedback to each learner. In a traditional course, when the instructor asks a question, the first student to answer is typically afforded the sole opportunity to provide an answer (p. 70).
However, students' responses to questions, particularly assessment and test questions, vary widely. Ricketts and Wilks (2002) suggested that online methods of assessment may adversely influence student learning. They concluded that all online instructors should conduct action research studies to explore the impact of their assessment methods on measures of student learning and course satisfaction.
The Purpose of the Study
The researcher/instructor had developed the online class in the spring of 2004 and had taught at least one section of the class every semester (including two summer sessions) prior to the study. Each semester, student feedback from course grades, IDEA online student evaluations, and student evaluations using a departmental survey was used to improve the delivery of the course.
The research questions for this study were formed when the researcher had the opportunity to teach two online sections of a graduate-level course in advanced human growth and development in the spring semester of 2006. The two questions were:
1. Does the use of weekly multiple-choice formative tests with immediate feedback impact student learning in the course?
2. Does the use of weekly multiple-choice formative tests with immediate feedback impact student evaluations of the course/instructor?
The importance of prompt feedback in online classroom assessment has been the focus of several recent research studies (e.g., Cashion & Palmieri, 2002; Greenberg, 1998; Shuey, 2002; Siew, 2003). Students in face-to-face classrooms expect graded work to be returned within the week. Because the time parameters are different in online classes, feedback to students can range from instant--for example in a Blackboard-graded multiple-choice exam--to weekly--as in an instructor-graded essay exam. Feedback to students serves as both an extrinsic motivator--when grades are involved--and an intrinsic motivator--when self-correcting is the primary motivating force.
Research on assessment strategies in face-to-face classrooms has found that varying the type and/or frequency of formative assessment produces measurable differences in student learning as measured by summative final course examinations (e.g., Brookhart, 2000; Stiggins, 1997). Little research in online classrooms, however, has tested these findings from face-to-face classroom assessment research.
The plan for this action research study was reviewed and approved by the IRB at the university. …