Until very recently, discussions of the crisis in U.S. education centered almost exclusively on public schools; higher education was viewed as nearly trouble-free -- the best in the world. Today, however, several problems loom. Many states' budgets for higher education are falling, and some are poised for truly terrifying reductions, even as student populations grow and become more diverse in both age and ethnic composition. At the same time, industries and consumers are demanding more of colleges even as resources shrink. Further, a broad shift from manual workers to "knowledge workers" means that students will require more education; a high school diploma no longer guarantees good job prospects. And many predict that job skills will need updating every few years; if so, "lifelong learners" will continue to demand education and retraining throughout their careers. All these changes are straining institutions of higher education to the breaking point.
Information technologies have played a vital role in higher education for decades. Television began delivering instruction to campuses and homes during the 1950s (remember Sunrise Semester?); before that, radio and film were used in a wide range of courses; and computers have populated school labs since the late 1960s. But only recently has interest in educational applications of information technology -- which now includes the Internet and the World Wide Web -- become nearly universal. In the past, discussion of educational technology was confined mainly to academic and teaching journals; now, almost every major newspaper has devoted at least a series of articles or a Sunday supplement to "Learning in Cyberspace," touting technology as a savior for education.