Journal of Research Administration

Practice Data from the 2002 SRA-BearingPoint Nationwide Benchmarking Survey


Introduction

Benchmarking has evolved over the past 20 years into a powerful tool for performance analysis and total quality management. The concept is simple: to know how well your organization performs a task or function, you need to know how well others perform the same task or function. Benchmarking has been defined as "the systematic comparison of elements of the performance of an organization against that of other organizations, with the aim of mutual improvement" (McNair and Leibfried, 1992).

In his book Thriving on Chaos, Tom Peters wrote: "... the term 'what gets measured gets done' has never been so powerful a truth" (Peters, 1987). Benchmarking has been embraced by many companies and industries. Companies have seen its value in assessing their competitive positions and adopting "best practices" that improve outcomes and bottom lines. By contrast, the educational and non-profit sectors have been slow to adopt the metaphors and methods of benchmarking, especially in the management and administration of research and other externally sponsored activities. By providing consistent, independent benchmarking and an objective forum for comparative analysis, the SRA-BearingPoint Sponsored Programs Benchmarking Program gives institutions the opportunity to achieve the benefits of benchmarking for very little cost and effort.

The general approach to benchmarking is captured in Figure 1. Benchmarking is a cyclical process with the following sequence of steps: (1) define a domain of key organizational activity (e.g., winning competitive research awards); (2) identify the "best practitioners" in that domain, either through quantitative performance data or reputational information; (3) document and describe in operational detail the practices of the "best practitioner" institutions; (4) disseminate the findings to other organizations, which can then replicate the practices of their more developed peers; and (5) use the information gained to identify different areas of interest for future efforts.

[FIGURE 1 OMITTED]

The SRA-BearingPoint Sponsored Programs Benchmarking Program has been in place since 1998. Three rounds of data collection (FY 1998, FY 2000, and FY 2002) have focused on institutional sponsored research competitiveness, administrative efficiency, productivity, and organizational practices. The data come from a nationwide sample of academic and nonprofit institutions representing over 40% of total U.S. academic research expenditures. The database is available to participating institutions through a web-based reporting and analysis tool, which allows participants to customize and generate institution-specific peer comparisons in a variety of tabular and graphical formats.

In the FY 2002 survey, the Program began to shift from a primary focus on quantitative performance data toward a balance between performance data and qualitative practice data. For the first time, we asked a number of practice questions that identify the current state of practice in three domains of sponsored programs administration:

1. Encouraging and facilitating faculty participation in research and other externally sponsored activity;

2. Preparing and developing sponsored programs administrative staff; and

3. Decentralizing responsibility and authority for sponsored programs administration functions.

Participants can couple information about performance with information about practices, determine how their practices in these domains compare to those of other participants, and identify participants whose practices may be worth adopting.

In previous reports, we described the development of the SPA Benchmarking Program, the inclusion of a broader community of participants as independent research institutes were added, the development of a web-based data collection and analysis system, and the results of the 1998 and 2000 surveys (Kirby and Waugaman, 2000-03). …
