Beyond Citation Analysis: A Model for Assessment of Research Impact
Sarli, Cathy C., Dubinsky, Ellen K., Holmes, Kristi L., Journal of the Medical Library Association
Question: Is there a means of assessing research impact beyond citation analysis?
Setting: The case study took place at the Washington University School of Medicine Becker Medical Library.
Method: This case study analyzed the research study process to identify indicators beyond citation count that demonstrate research impact.
Main Results: The authors discovered a number of indicators that can be documented for assessment of research impact, as well as resources to locate evidence of impact. As a result of the project, the authors developed a model for assessment of research impact, the Becker Medical Library Model for Assessment of Research.
Conclusion: Traditional citation analysis alone is not a sufficient tool for assessing the impact of research findings, nor is it predictive of subsequent clinical applications that result in meaningful health outcomes. The Becker Model can be used by both researchers and librarians to document research impact and thereby supplement citation analysis.
A traditional method of assessing research impact, citation analysis is performed by examining an individual publication and assessing how often, if ever, it has been cited by subsequent publications. It is a tool for gauging the extent of a publication's influence in the literature and for tracking the advancement of knowledge, with the inherent assumption that significant publications will demonstrate a high citation count [2-4]. While citation analysis is subject to some flaws, such as self-citing and reciprocal citing by colleagues [5, 6], it is accepted as a standard tool for assessing the merits of a publication.
In May 2007, a principal investigator from the Ocular Hypertension Treatment Study (OHTS) requested a citation analysis of OHTS articles after viewing a poster by Sieving, "The Impact of NEI-Funded Multi-Center Trials: Bibliometric Indications of Dissemination, Acceptance and Implementation of Trial Findings," which had been presented at the 2007 meeting of the Association for Research in Vision and Ophthalmology. The authors performed a citation analysis of twenty-six OHTS peer-reviewed articles using the Scopus, Web of Science, and Essential Science Indicators databases.
Of the twenty-six journal articles, several demonstrated high rates of citation; that is, they were often cited by subsequent publications. In some instances, the citation counts exceeded baseline citation rates as reported in Essential Science Indicators, a database that assesses intellectual impact by comparing a publication's citation rate against those of other publications in one or more areas of research.
The high citation rates for the selected journal articles sparked the authors' interest in further investigation to determine why these articles were cited so often by other peer-reviewed journal articles. Was this indicative of significant findings that might have resulted in clinical outcomes? If so, what were those outcomes and how could they be revealed? What other evidence of research impact could the authors discover by going beyond citation analysis?
A cursory search of the Google and Yahoo search engines using key terms related to OHTS (e.g., open angle glaucoma, pachymetry, central corneal thickness, ocular hypertension) and the title of the study (e.g., OHTS, Ocular Hypertension Treatment Study) yielded findings such as practice guidelines, continuing education guidelines, curriculum guidelines, insurance coverage documents, and quality measures guidelines that cited OHTS as supporting documentation. Further analysis using other resources, such as government websites, revealed additional evidence of research impact attributed to OHTS findings. These materials are not usually indexed by databases, nor are they consistently captured as cited-by publications. Given the depth of evidence of research impact not revealed by citation analysis alone, the authors decided to perform a more systematic and comprehensive evaluation. …