The authors describe an innovative course at the Harvard Graduate School of Education that places graduate students and public school teachers on school-based teams and asks them to solve real problems using real student data.
STATE accountability systems that are based on test data and the No Child Left Behind Act have put educators under great pressure to improve their students' scores on standardized tests. Much has been written about the possibility that school faculties will resort to "drill and kill," a response that will reduce the quality of children's education. Much less has been written about what it takes for teachers and administrators to be able to use student assessment results to learn about children's skills and about the effectiveness of instruction -- and then to use that learning to guide instructional improvements.
We report here on a yearlong workshop for teachers and administrators from the Boston Public Schools (BPS) and for students from the Harvard Graduate School of Education (HGSE). There are four key elements of our course design: 1) organizing around a clear process, 2) teaching about three kinds of tools, 3) assigning projects that use real school data, and 4) supporting collaborative work. While we by no means claim that our approach is the only way to teach educators to make constructive use of student assessment results, we believe that our experiences provide useful lessons to school and district leaders who want to help educators learn to do this work.
While the workshop described here is a graduate course, it is very different from other courses that teach students how to understand basic statistics, make inferences from data, and use statistical software. What makes it different is its focus on placing data analysis in the greater context of school improvement. Participants are assigned to school-based teams consisting of both BPS faculty members and Harvard graduate students and spend the bulk of the course working with real school data to solve problems. The aim is for participating schools to benefit from a structured and supported opportunity to make progress on work they need to do and for HGSE students to benefit from a truly authentic learning experience.
In the last academic year, workshop participants included 16 teachers and administrators from nine Boston schools and 10 HGSE students, who were matched with the schools to form school-based teams. The schools included four high schools, two middle schools, and three elementary schools (one K-8 and two K-5). Most HGSE students also served as principal or teacher-leader interns at the participating schools, and all of the HGSE students had at least three years of teaching experience. Course participants received one semester of Harvard credit for successfully completing the course, which met for 13 sessions of 2½ hours each over the course of the school year.
Throughout the workshop, those of us who taught the course collected data to inform the evolving course design and to provide evidence for our research. Sources included online surveys conducted at the beginning, middle, and end of the course; assignments produced by school teams and individuals; and notes from group debriefing sessions, focus groups, and observations during classes and from school site visits. From these varied sources, we came to understand that the main impact of the course was a change in participants' attitudes toward using data. In short, they developed the conviction that using data to reflect upon practice could have a powerful effect on teaching and learning, as well as the confidence that they had gained the skills and knowledge needed to lead this kind of work. Now for some specifics about the four elements of our course design.
Design Element 1. Organizing around a clear process. Our teaching team designed the course syllabus around an improvement cycle that we adapted from the literature on school improvement (Figure 1). …