Magazine article American Libraries

Data-Power to the People; CD-ROM Gives Statistical Research Power to All Libraries and Challenges Librarians to Produce Answers, Not Sources

Article excerpt


WHEN LIBRARIANS SUPPLEMENT the printed page, "the eye-readable," with machine-readable materials, these materials become new versions of the book, more compact and more durable. But when the machine-readable formats consist of numbers, tables, and charts instead of text, they require specialized skills and different methods of handling.

Emily Gallup Fayen opened a recent article in American Libraries by alluding to what has become an almost mind-numbing outpouring of predictions on the library of the future. She resisted the temptation to add her own. Instead, she treated the profession to an eloquent and thoughtful examination of how we must reorient our approach to service in order to continue as valued professionals in the environment technology is shaping for us.

The issues Fayen raised are of interest to all librarians. However, we at the University of Georgia Libraries felt she was speaking directly to us. Already serving a major research university with impressive collections of books, journals, microforms, and media, the University of Georgia Libraries early in 1986 assumed responsibility for collecting, accessing, and archiving a large, heavily used collection of machine-readable databases. The information on the hundreds of reels of magnetic tape in the collection is accessed through some of the most advanced mainframe computers available and represents an invaluable addition to the social sciences and business. The Libraries' capabilities made a quantum leap virtually overnight. As we worked to fulfill the exciting potential of this collection, we discovered that the same resource will be available to virtually every library, within months!

Data sets proliferate

The U.S. Bureau of the Census, developer of the now-disappearing punch card, began using a computer to process the 1950 census and first released tapes to the public with the 1970 census. During this period, an organization now known as the Inter-university Consortium for Political and Social Research (ICPSR) at the University of Michigan began to collect, archive, and make numeric data sets available to individual researchers through their member institutions. Early on, these collections consisted largely of election data and major surveys of political attitudes. Today, the ICPSR offers databases ranging from records of U.S. farm real estate values and surveys of consumer finances to censuses of religious bodies and perceptions of youth from high school into early adulthood. JoAnn Dionne's recent survey of numeric social science databases provides an excellent overview and a useful bibliography.

As the Bureau of the Census, the ICPSR, and other sources expanded the availability of machine-readable data sets, social scientists radically altered the way they did research. With vast and growing amounts of demographic and survey data already amassed and in a format readily interpretable by computers, fewer researchers collected their own data. Many social scientists began to rely more on these data collections and less on printed materials in the library. After all, who would spend hours transcribing figures from printed volumes of the World Bank Tables and calculating correlations when the same machine-readable data could be processed by a few commands to the computer?

Unfortunately, until now there was no inexpensive or efficient way for libraries (or anyone else) to handle these immense data sets. Though magnetic tapes are a significant improvement over punch cards, they typically have been stored in computer centers because they had to be processed on large, expensive machines. Programmers were needed for even the simplest of operations, and seemingly endless delays resulted when researchers failed to communicate with programmers, who then made trial run after trial run to correct inevitable errors.

With the exception of pioneering efforts such as those at Yale, the University of Florida, and the University of British Columbia, most research libraries have left it up to computer centers to deal with the machine-readable materials. …
