The Higher Education Technology Revolution


Nowhere in higher education has there been as much change as in the use of information technology. Not only have advances in information technology (IT) given institutions useful tools such as personal computers and campus networks, but those same advances are driving a transformation of higher education itself.

"[Information technology] has shaken up the bigger picture of higher education," says Jim Duke, an IT manager at St. Paul's College in Virginia, noting that institutions are competing against one another on the basis of the wealth and use of their IT facilities and resources.

This transformation is reshaping how higher education institutions communicate with and educate students. Professors are using the Internet as a communications medium to enrich the curriculum and to enliven exchange and discussion with their students.

With advanced technology, dozens of colleges and universities, along with corporate America, are expanding the capacity of American higher education through distance education. Information technology is making it possible for faculty to teach students living far from central campuses.

Technologies such as satellite videoconferencing, Internet-based teleconferencing, and interactive multimedia classrooms are giving schools the ability to reach and educate nontraditional students in numbers that will multiply that capacity. Demand for lifelong learning among adults is expected to drive this expansion of distance education.

While experts have largely cheered advances in information technology, advocates for minority students warn that exclusion from new information technologies could foster deep class divisions between people who have access to information technology and those who do not. Government officials and activists report that a "digital divide" is growing, with minorities comprising a large segment of those lacking basic access to computers and the Internet.

Black Issues In Higher Education presents 15 developments that have laid the foundation for the current information technology environment. These include specific technical innovations as well as broader trends that encompass the use of several technologies.

1) Introduction of the Apple Macintosh personal computer on campus

In 1984, Apple Computer introduced the Macintosh and marketed it heavily to colleges and universities. What distinguished the Macintosh from earlier PCs was that it was the first major personal computer to use a graphical user interface (GUI), the visual display of on-screen objects that makes programs easy to use. As a result, Apple PCs became highly popular among students, faculty, and administrators. In addition to its popularity, the Macintosh GUI served as the precursor to the Microsoft Windows GUI.

2) Emergence of the Internet

Although the Internet was developed largely by American academic researchers backed by the U.S. Defense Department in the 1960s and 1970s as a computer network capable of withstanding nuclear attack, it did not become a commercial and popular network until the 1990s. Even on college and university campuses, the Internet found widespread popularity only after critical innovations made it convenient to access and use.

3) Computer Chip Innovation

Over the last 15 years, computers have grown cheaper and more powerful as a result of improvements in computer chip technology. In the 1970s and 1980s, the concept of very-large-scale integration (VLSI), in which hundreds of thousands of transistors are placed on a single chip, took root as the path to ever more capable microcomputers. Microcomputers eventually became known as personal computers, and innovation in microprocessors, or computer chips, has continued unabated. …