By Herman Gaither
Technology & Learning, Vol. 26, No. 2
This summer, educators gathered in Seattle to reflect on the tenth anniversary of Anytime, Anywhere Learning, Microsoft's initiative to put laptops into the hands of middle and high school students. The pioneers of this program, of which I was one, aimed to dramatically alter instruction, empower teachers, and engage students.
These days, however, the emphasis has shifted from using technology for instruction to employing it to assess, track, mine, and present data. I agree that those applications are necessary to meet new requirements such as No Child Left Behind, but it concerns me that they are becoming the dominant use of technology in education. Data-driven decision making, while important, should not diminish the use of technology in classrooms or drain scarce resources from instructional technology. After all, you do not make a pig fat by weighing it.
So how did we get ourselves into this predicament? Let's start at the beginning. In the mid-'90s, schools were anxious to get powerful new technologies into the classroom. Unlike the previous generation of instructional tools, these technologies had the potential to redefine basic interactions between students and teachers. They offered 24/7 access to information and empowered students with previously unavailable research resources. They pushed educators to redesign instruction in order to connect with students of the digital generation.
Meanwhile, technology providers quickly formed partnerships with districts to create appropriate offerings. In many cases, drill-and-kill products were replaced by tools that encouraged the development of higher-order thinking skills. Likewise, the more cutting-edge schools overhauled traditional staff training to equip teachers with the skills to reinvent instruction.
My former school district was an early participant in this adventure. We developed a technology plan, built our infrastructure, and trained staff. …