The University in the Twenty-First Century


In his essay on contemporary higher education, John Tagg argues that colleges and universities, the engines of progress in the contemporary knowledge society, are failing badly and stand in need of radical transformation. I agree with Tagg that much reform is needed. However, I am far less skeptical about the ability of American colleges and universities to transform themselves. Although they have often been characterized as "ivory tower" institutions remote from the "real world," over the centuries they have in fact responded to the needs of a changing marketplace. There is no reason to believe that they will cease to do so in the information age.

One of the most important functions of the university has always been to train personnel for knowledge-intensive professions in the labor market. In eleventh-century Bologna, that market consisted of the two most important institutions of the time, church and state. Not surprisingly, the only courses of study initially offered by the University of Bologna were canon and civil law. In eighteenth-century Prussia, the state came to regard inherited feudal status as a woefully inadequate basis for the appointment of public officials and required instead a professionally trained corps to hold public office. In addition to the administrative and judicial bureaucracy, this corps included teachers and professors in state-controlled schools and universities and the ecclesiastical hierarchy of the established Lutheran Church. The academic offerings of Prussian universities expanded to meet this demand.

Apart from law, medicine, and theology, professional and scientific training did not become an important part of the university's mission until late in the nineteenth century. Germany was the first country to respond to the rapid industrialization then taking place in western Europe and North America: it created the research university, a complex of graduate schools performing advanced research and experimentation, which became the worldwide model.

GROWTH OF THE MODERN UNIVERSITY

The rationalization of agriculture and the growth of industry in the second half of the nineteenth century were principal factors in the expansion of the modern American university system. In 1862 Lincoln signed the Morrill Act, granting thirty thousand acres of land for each representative and senator "for the endowment, support and maintenance of at least one college ... to teach such branches of learning as are related to agriculture and the mechanic arts." The act provided the basis for the extraordinarily successful American land grant system of agricultural education and research without excluding "scientific and classical studies." Many of the land grant colleges have become great universities. These include Ohio State University, Michigan State University, Cornell, and the Universities of Maryland, Georgia, Florida, Wisconsin, Illinois, and West Virginia.

With its insatiable demand for ever more effective weapons systems, war has contributed enormously to the growth of the modern university system. In World War I, chemists and physicists worked on weapons development in government laboratories. In World War II, the government contracted its weapons research projects to the universities themselves. Physicists first produced plutonium at the University of California at Berkeley. On December 2, 1942, the first man-made, self-sustaining nuclear chain reaction was achieved in a squash court beneath the stands of the University of Chicago's unused football stadium. The war also created a heightened demand--largely met by American colleges and universities--for economists, sociologists, demographers, political scientists, psychologists, managerial experts, historians, cryptographers, professionals skilled in foreign languages, and, in general, those who possessed knowledge concerning both Allied and enemy countries.

After the war, returning veterans and the Cold War provided the basis for the further expansion of the university system. …