An Analysis of Cataloging Copy: Library of Congress vs. Selected RLIN Members


In January 1987 the Cataloging Unit of the Albert R. Mann Library, Cornell University, had a backlog of more than 5,000 monographs. The acquisitions rate in Mann Library had increased 75 percent in the previous two years, and the cataloging backlog, reflecting this increase, had doubled. In order to stem the growth of the backlog, and perhaps even reduce it, we felt that our methods for handling cataloging copy should be analyzed. Traditionally, support staff had handled current Library of Congress copy, while original catalogers handled member-contributed copy and original cataloging. The most significant growth in the backlog had been in the original/member backlog.

The newly appointed head of Technical Services and two of the original catalogers decided to examine the quality of member-contributed copy and the feasibility of giving Research Libraries Information Network (RLIN) member copy to catalogers who work with Library of Congress (LC) copy. At the same time, we would institute, for certain categories of books with LC copy, a pilot procedure of cataloging on receipt in the Acquisitions Unit.

For many years, Mann Library, like many other libraries, had maintained a list of preferred member libraries. This list had not been drawn up through any rigorous study, such as that used by Wing. [1] Instead it had been compiled informally, based on the cataloging staff's day-to-day experience evaluating cataloging copy. All the libraries on this list were large academic research institutions with subject strengths similar to Mann Library's. Their cataloging was judged to correspond to ours in adherence to nationally accepted standards, level of description, accuracy of classification, and completeness and specificity of subject analysis. We decided to use our list and test whether the "best" member copy came close to meeting the standards of LC copy. If it fell short, we would determine how it varied and how we could adjust our training to accommodate it. From the beginning we knew that any such test of cataloging copy could be made only with the cooperation and commitment of the cataloging staff. Although it was somewhat difficult to engage in research while the backlog grew, the cataloging staff was eager to devise a new pattern for technical processing and willingly participated in the study.


A search of the published literature yielded several studies conducted with the aim of evaluating the quality of member copy cataloging. In 1978, Ryans published a study that analyzed the types of errors found in records contributed to the Online Computer Library Center (OCLC) and the fields most frequently affected. [2] While Ryans looked primarily at the major variable fields (main entry, title, edition, imprint, collation, series, subject headings, and added entries), Hudson, in a 1981 study of OCLC copy, [3] also examined revisions in the fixed fields and tagging. In Ryans's study, 60% of 700 records were considered acceptable with no changes; Hudson's study involved 1,017 records, of which approximately 60% required some revision. The quality of RLIN member copy was examined by Wing, [4] primarily for the purpose of establishing a "preferred order list" of libraries with acceptable cataloging to be used by copy catalogers. Indeed, many libraries use such "white lists," but little has been published about how these lists are determined. [5] The accuracy of Library of Congress cataloging copy, including CIP-based copy, was examined by Taylor and Simpson in 1983. [6] They found very few errors per record on the whole (47% were error free; only 27% had two or more errors or discrepancies), concluding that LC copy is relatively high in quality regardless of origin.


We were fortunate at Mann Library to be able to draw upon the resources of the Biometrics Unit in the College of Agriculture and Life Sciences. …

