By Michael Bugeja
The Futurist, Vol. 42, No. 1
In the midst of the consumer technology boom of 1999--the diffusion of cell phones, laptops, music players, and gaming consoles--I began work on the concept of the "interpersonal divide," or the social void that I observed developing as more people came to rely on mediated rather than face-to-face communication.
My warnings went unheeded because of the hoopla over the global village, particularly in academia, which was investing billions of dollars in information technology to facilitate the rapid growth of computing on campus. As the director of a journalism school, I found that my warnings about corporate profit at the expense of public institutions worried colleagues, benefactors, and media practitioners alike.
The new technologies that now keep us constantly connected also keep us constantly distracted. Educators know that wireless technology has disrupted the classroom, with students browsing (and even buying) online during lectures. However, the new challenge is the pervasive unwillingness to do anything about it.
Digital distractions now keep us from addressing the real issues of the day. Each of us consumes an average of nine hours of media daily through myriad technological platforms. As a journalism professor, I'm especially sensitive to this emerging state of constant distraction and its effects on what we watch and read. This is not the Age of Information. This is the Age of Distraction. And distraction in academia is deadly because it undermines critical thinking. That affects all of us--and the future.
Without critical thinking, we create trivia. We dismantle scientific models and replace them with trendy or wishful ones that are neither transferable nor testable. We have witnessed this with such issues as global warming, worldwide pandemics, and natural selection. Thus, I theorize that standards of higher education have been lowered, not raised, because of new information and consumer technology.
For more than a decade now, university administrators have been touting technology. Apple, Dell, Gateway, and Gates persuaded us to become citizens of a brave new media world that promised to enfranchise and enlighten everyone with universal access. Now, access is omnipresent in our wireless campuses and workplaces. Was that investment well spent? The U.S. Department of Education found no difference between the performance of kids who used academic software programs for math and reading and those who did not.
In fact, reading scores in 2005 were significantly worse than in 1992, according to the National Assessment of Educational Progress, the nation's report card. And in math, only 23% of all twelfth graders were proficient. Worse, these sinking scores occurred even though high-school students averaged 360 more classroom hours in 2005 than in 1990.
We need to investigate whether distractions in wireless classrooms might be to blame. Have we compared the scores of school districts investing modestly versus heavily in technology, adjusting for factors like household income, to see if our digital classrooms make any difference?
Assessment is no longer the norm in higher education. Worse, universities are investing in online virtual worlds vended by companies whose proprietary terms of service often conflict with disclosure and due process. Costs keep mounting, from bandwidth to security. We have trouble funding real campuses, yet we lease and staff digital "land" that is not really there.
The question at issue is who will be the bearer of truth in the digital age--the professor or the processor? To answer that, we first must dispel the myth that technology is merely a tool whose effects depend on how one uses it.
Technology, in fact, is an autonomous system. As philosopher Jacques Ellul (1912-1994) foresaw, technology changes dramatically whatever it touches. Introduce technology into journalism, and henceforth journalism is about technology. …