In the fall of the 1999-2000 school year, the Rosedale Union School District took a bold step in critical self-evaluation. At the request of the superintendent, the Board of Trustees approved a comprehensive curriculum management audit.
A curriculum audit is a powerful, comprehensive management tool that gives schools and their communities the impetus and means to systematically design and align curriculum, instruction and assessment. Such a system enables the school district to make maximum use of its resources in the education of its students.
In the spring of 2000, the audit team offered written recommendations for the improvement of the Rosedale Union School District. One of the most ambitious of the recommendations was to "establish and implement a comprehensive district assessment program to provide meaningful data for decision-making in student learning, program evaluation and the improvement of teaching." Thus began a 24-month timeline of change.
Motivation for change
The results of the curriculum audit were shared with the Board of Trustees, administrative staff, teachers and community members (via a district newsletter). A committee representing all stakeholders was then gathered to recommend a plan of action. Without exception, the primary recommendation of the committee was to allocate all resources necessary to develop a student-assessment program aligned with state standards as well as the written and taught curriculum.
The stakeholder group called for an evaluation tool that would be administered multiple times per year (to coincide with grading periods) and that would provide feedback on the effectiveness of instruction. The committee agreed that a pilot year for development and field testing should be allowed.
Preparation for change
With help from our assistant superintendent of curriculum, we began a search for a published assessment program to monitor the district's 3,700 first- through eighth-grade students. We quickly discovered two things:
1. For a district our size, it could easily cost more than a half million dollars to purchase an instrument linked to the standards in the areas of reading and math.
2. The reported reliability and validity of most assessment instruments were too low for high-stakes decision making. Although we initially believed finding an assessment tool would be the least of our worries, this part of the change process took approximately nine months.
Given the tough financial times we all face, our superintendent quickly ruled out the very expensive programs. The remaining choices left a lot to be desired. Any way we looked at it, we knew that whatever we purchased, we would still have a lot of work ahead. Finally, the district decided to purchase an item bank of standards-based test questions.
Along with this purchase, we moved a classroom teacher into the district office and placed him on special assignment for one year. The teacher we moved had extensive knowledge of both the California standards and computer technology. Our teacher spent the next 10 months developing the test, field-testing it, providing teacher in-service and monitoring test results for any unforeseen problems.
What to test
The easy answer to what to test is, of course, the California content standards. Our district chose to address reading, writing and math proficiencies. The challenge was less what to test, and more how and when to test.
At the beginning of the 2001-2002 school year, we asked for teacher representation from each of our then seven schools to help determine a scope and sequence for our test. The committee ended up having one teacher per grade level from each site at the elementary level and two teachers per grade at the middle school (one reading, one math). There were two such meetings.
The primary task for our large-scale committee was to have these representative teachers return to their individual sites and meet with grade-level staffs. It was then up to the grade levels to determine when each standard was being introduced during the year and when to expect student mastery.
These meetings gave us information we had not expected; it quickly became obvious that we were not assessing "benchmarks," as we had planned. Benchmark testing would indicate mastery of a concept, then offer no further testing. What we were developing was a "proficiencies" test.
With a proficiencies-type test, staff can introduce concepts early in the year and monitor the students' rate of learning, without necessarily expecting mastery in all areas until the end of the school year. This way, even if students achieve mastery early in the instructional year, we continue measuring it to ensure they retain the information.
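The distinction can be sketched in a few lines of code: a proficiency-style test keeps measuring a standard every quarter, even after the class reaches mastery, so retention problems become visible. The pass rates and the 80 percent mastery threshold below are hypothetical illustrations, not district data.

```python
# Hypothetical illustration of proficiency-style monitoring: each standard
# is measured every quarter, and mastery (here, an assumed 80 percent of
# students passing) is re-checked in later quarters to confirm retention.
MASTERY_THRESHOLD = 80  # percent of students passing; an assumption

quarterly_pass_rates = {
    # standard: percent of students passing in Q1..Q4 (made-up data)
    "Reading 2.3": [35, 62, 84, 88],
    "Math 1.1":   [70, 85, 81, 64],  # mastered early, then slipped
}

for standard, rates in quarterly_pass_rates.items():
    # First quarter in which the class crossed the mastery threshold, if any.
    mastered_q = next((q for q, r in enumerate(rates, 1)
                       if r >= MASTERY_THRESHOLD), None)
    # A benchmark test would stop here; a proficiency test keeps checking.
    slipped = mastered_q is not None and any(
        r < MASTERY_THRESHOLD for r in rates[mastered_q:])
    status = ("not yet mastered" if mastered_q is None
              else f"mastered in Q{mastered_q}"
                   + (", retention concern" if slipped else ", retained"))
    print(standard, "->", status)
```

Under a benchmark model, "Math 1.1" would have been retired after the second quarter and the fourth-quarter slide would never have been seen.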
After consensus was built at the district level, each teacher was given a chart showing all reading and math standards for his or her grade level, when each standard was to be introduced, and when during the year testing on each standard would begin.
As the year progressed and the test started coming together, we were able to move to grade-level planning meetings, which took place just before and after each of the quarterly administrations.
The test item bank RUSD purchased is called "The Curriculum Director," from Bookette Software. The program allows users to select test questions by grade level in the areas of reading and math. All items are multiple choice and you can customize the test to be as long or brief as you need.
Although the item bank contains thousands of test questions, we found many questions were not truly linked to California standards. Additionally, some grade levels had more test items than others, dictating a need for us to generate and field test our own questions. Our staff estimates that up to 25 percent of our finished assessment instrument was developed on our own.
The timeline for test construction stretched over an eight-month period and was slowed by the software publisher's failure to provide a user manual.
The final product
The final product is a multiple-choice format test. Second- through eighth-grade students read the questions from a test booklet and mark their answers on a scannable answer sheet. First graders are allowed to answer directly in the booklet, and teachers then transcribe their answers onto the answer sheet.
At the time of test construction, our district personnel decided to develop three versions of the assessment so that we could run a three-year cycle before the same test was seen again. Although these parallel versions may not have been a necessity, we knew we only had our teacher on special assignment for one year, so it was better to go the extra mile while we had the resources.
Not wanting to overwhelm either the teaching staff or the students taking the test, we devoted a great deal of discussion to the amount of time testing would require. After extensive piloting, our staff felt we could gather enough information about our students' progress in the curriculum by conducting four evaluations per year.
Each proficiency-testing window occurs during a four-day period two weeks prior to the end of each grading quarter. Each of the quarterly measures contains different test items to minimize any effects of memory.
Using the data
At the district office level we may be able to put great amounts of time, money and effort into a project, but if teachers can't use the data, the project has no value. Therefore, providing meaningful information continues to be our top priority.
During this project's pilot phase, district personnel provided printouts to each teacher showing individual student scores and each student's level of mastery on the reading and math content standards. We also provided a summary to each principal, allowing principals to monitor the achievement level of all classrooms on their campuses. The "Curriculum Director" software mentioned earlier generates these printouts.
Teachers and principals have reported overall satisfaction with this level of feedback. We will continue to provide these reports within two weeks of the test administration, just in time for parent conferences and report cards.
With the requirement of writing the Single School Plan for Student Achievement, our curriculum staff knew this was a great opportunity to use local assessment to modify instruction and improve student achievement. To accomplish this, we needed to develop reports that the Bookette software could not generate.
The RUSD curriculum staff put together a data summary sheet to fill this need. The summary, called the "Data Analysis Table," shows the level to which students have mastered standards by grade level for each school. This way, grade-level teams can meet and determine what changes, if any, need to be made on their particular campus. This school- and grade-level-specific feedback is given each quarter, right after the test results are calculated.

Teachers and administrators agree that the most valuable information they receive from the proficiency testing is the School Data Analysis Table. Attached to it is a one-page summary of how to use and interpret the information. It states:
"The Data Analysis Table has several components; on the left side of the report you will see the reported domains (such as reading, writing, algebra, etc.). Under the particular domain, you will see the strands (e.g. under the reading domain you may see the strands of word analysis, reading comprehension, literary response, etc.). Finally, under each strand you see multiple substrands (e.g., under the reading comprehension strand you may see substrands of prior knowledge, topic and central idea, conclusions and inference, etc.).
"For each substrand tested, a score will be reported for that particular quarter. This score is unique to each individual school and grade level. The score reported is the percentage of students at your school and at your grade level who scored 100 percent mastery on that substrand. Each score receives a ranking from 1 to 5.
"When your grade level reviews the results of the proficiency testing, you should always put together an action plan for the areas with the lowest scores/levels. Working as a grade level in conjunction with your principal, you will need to decide (in your action plan) what you need to do to raise student performance."
Feedback leads to intervention plans
More than 24 months have passed since we began our process of aligning instruction to the California content standards based on local assessment. We are now able to give teachers prompt feedback on their students' achievement in a fashion that allows them to develop an intervention plan for the next grading quarter. The software we purchased was not expensive, but it needed quite a bit of fine-tuning. It was critical to keep both teaching and administrative staff members informed about how and why changes were being implemented.
For other districts considering a route similar to Rosedale's, the best advice we can offer is to place a teacher on special assignment who has a high level of technical skill and is familiar with the standards. Our final result is data-based decision making for more effective instruction.
Thomas Ewing is coordinator of the curriculum department for the Rosedale Union School District in Bakersfield.