Benchmarking in Sponsored Programs Administration: Using the Web to Analyze Nationwide Data Collection Results

Shop Talk

Introduction

The Society of Research Administrators International (SRA), the National Association of College and University Business Officers (NACUBO), and the Higher Education Practice of KPMG Consulting, Inc. have jointly undertaken the development of a national benchmarking program. This program has two purposes: (a) to provide participating institutions with tools for quantitative analysis of their activities and with comparison data, and (b) to provide the sponsored programs administration community with performance and practice benchmarks to aid training and development programs.

Two complete rounds of data collection (FY 1998 and FY 2000) focus on institutional sponsored research competitiveness, administrative efficiency, productivity, and organizational practices. The resulting database includes a nationwide sample of academic and non-profit institutions, representing over 40% of total U.S. academic research expenditures. Data are available to participating institutions through a Web-based reporting and analysis tool. This reporting system allows participants to customize and generate institution-specific peer comparisons in a variety of tabular and graphical formats. This brief describes the measures and refers participating institutions to the Web page that allows them to make online comparisons.

Results from the FY 1998 and FY 2000 national surveys are widely available. Visit the SRA International Benchmarking Web page or the Research Management study area at www.higheredbenchmarking.com.

The Need for Customized Reporting

As a result of experience gained during the first round of data collection and participant feedback, the study leaders moved to the World Wide Web to make the data collection process more efficient and to give participants more flexibility and control over the reporting process. A powerful reporting tool is available to study participants. The following sections describe the data elements and illustrate how the tool can be used to analyze comparative data.

Data Elements and Variables

The following sections describe the various data elements, predefined variables, and predefined comparison groups in the system. Refer to the survey definitions for the description of each element and its inclusions and exclusions. See Table 1 for key data definitions.

Indicators and Variables. The performance indicators are organized around four themes: (a) sustaining or enhancing sponsored projects activity and funding, (b) containing the costs and improving the efficiency of sponsored projects administration, (c) improving administrative services to faculty, and (d) maintaining and improving institutional and sponsor accountability. Table 2 lists the demographic comparison groups for analysis.

Examples of Results

Figures 1 and 2 illustrate the types of analyses that can be done using the Web-based reporting tool. These examples were produced using data from the FY 1998 survey and early data from the FY 2000 survey. The graph in the first figure plots a hypothetical participant's number of proposals submitted per 100 faculty FTE for FY 1998 and FY 2000. The participant performs above the mean values for all participants and for the comparison group of NSF top-100 universities in the sample.
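For readers who want to reproduce this kind of ratio indicator outside the reporting tool, the sketch below shows how such a measure might be computed and set against comparison-group means. The proposal counts, faculty FTE figures, and group membership are purely illustrative assumptions, not survey data; the tool's own calculations are governed by the survey definitions.

```python
from statistics import mean

def proposals_per_100_fte(proposals_submitted: float, faculty_fte: float) -> float:
    """Normalize proposal volume by institutional size (per 100 faculty FTE)."""
    return 100.0 * proposals_submitted / faculty_fte

# Hypothetical inputs: one participant plus two comparison groups.
participant = proposals_per_100_fte(proposals_submitted=450, faculty_fte=620)

all_participants = [proposals_per_100_fte(p, f)
                    for p, f in [(300, 510), (820, 1400), (95, 180)]]
nsf_top_100 = [proposals_per_100_fte(p, f)
               for p, f in [(1100, 1650), (900, 1500)]]

print(f"Participant:            {participant:5.1f}")
print(f"All-participant mean:   {mean(all_participants):5.1f}")
print(f"NSF top-100 group mean: {mean(nsf_top_100):5.1f}")
```

Normalizing by faculty FTE, as the survey's ratio indicators do, is what makes institutions of very different sizes comparable on the same chart.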

The graph in the second figure plots direct data, not ratio measures. It compares the participant's staffing levels in post-award financial administration with the mean staffing levels of other participants and of the NSF top-100 comparison group. …
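A comparable chart for direct (unnormalized) measures can be sketched locally as well. The staffing figures below are invented for illustration only and do not come from the survey.

```python
import matplotlib.pyplot as plt

# Hypothetical post-award financial administration staffing, in FTE.
labels = ["Participant", "All participants (mean)", "NSF top-100 (mean)"]
values = [6.5, 8.2, 11.4]

plt.bar(labels, values)
plt.ylabel("Post-award financial administration staff (FTE)")
plt.title("Staffing comparison (illustrative data)")
plt.tight_layout()
plt.savefig("staffing_comparison.png")
```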