Benchmarking in Sponsored Programs Administration: Using the Web to Analyze Nationwide Data Collection Results. (Shop Talk)

By Kirby, William S.; Waugaman, Paul G. | Journal of Research Administration, April 2002

Introduction

The Society of Research Administrators International (SRA), the National Association of College and University Business Officers (NACUBO), and the Higher Education Practice of KPMG Consulting, Inc. have jointly undertaken the development of a national benchmarking program. This program has two purposes: (a) to provide participating institutions with tools for quantitative analysis of their activities and with comparison data and (b) to provide the sponsored programs administration community with performance and practice benchmarks to aid training and development programs.

Two rounds (FY 1998 and FY 2000) of complete data collection focus on institutional sponsored research competitiveness, administrative efficiency, productivity, and organizational practices. The resulting database includes a nationwide sample of academic and non-profit institutions, representing over 40% of total U.S. academic research expenditures. Data are available to participating institutions through a Web-based reporting and analysis tool. This reporting system allows participants to customize and generate institution-specific peer comparisons in a variety of tabular and graphical formats. This brief describes the measures and refers participant institutions to the Web page that allows them to make online comparisons.

Results from the FY 1998 and FY 2000 national surveys are widely available. Visit the SRA International Benchmarking Web page or the Research Management study area at www.higheredbenchmarking.com.

The Need for Customized Reporting

As a result of experience gained during the first round of data collection and feedback, the study leaders moved to the World Wide Web to make the data collection process more efficient and to provide participants with more flexibility and control over the reporting process. A powerful reporting tool is available for the study participants. The following sections describe the data elements and illustrate how the tool can be used to analyze comparative data.

Data Elements and Variables

The following sections describe the various data elements, predefined variables, and predefined comparison groups in the system. Refer to the survey definitions for the description of each element and its inclusions and exclusions. See Table 1 for key data definitions.

Indicators and Variables. The performance indicators are organized around four themes: (a) sustaining or enhancing sponsored projects activity and funding, (b) containing the costs and improving the efficiency of sponsored projects administration, (c) improving administrative services to faculty, and (d) maintaining and improving institutional and sponsor accountability. Table 2 lists the demographic comparison groups available for analysis.

Examples of Results

Figures 1 and 2 illustrate the types of analyses that can be done using the Web-based reporting tool. These examples were produced using data from the FY 1998 survey and early data from the FY 2000 survey. The graph in the first figure plots a hypothetical participant's FY 1998 and FY 2000 values for the number of proposals submitted per 100 faculty FTE reported for each year. The participant shows higher performance than the mean values of all participants and of the NSF top-100 universities comparison group in the sample.
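
To make the ratio indicator behind the first figure concrete, the following sketch computes proposals submitted per 100 faculty FTE for one institution and compares it with the means of all participants and of an NSF top-100 comparison group. The data, institution names, and field layout here are hypothetical illustrations only; they are not drawn from the study's database, and the sketch is not the Web tool's actual implementation.

```python
# Hypothetical sketch: computing a normalized indicator ("proposals
# submitted per 100 faculty FTE") and comparing one institution against
# peer-group means. All numbers and names below are invented.

institutions = [
    # (name, proposals submitted, faculty FTE, NSF top-100 flag)
    ("Institution A", 1200, 900, True),
    ("Institution B", 450, 400, False),
    ("Institution C", 2100, 1500, True),
    ("Institution D", 300, 350, False),
]

def proposals_per_100_fte(proposals, faculty_fte):
    """Ratio indicator: proposals submitted per 100 faculty FTE."""
    return 100.0 * proposals / faculty_fte

# Indicator value for a single (hypothetical) participant
participant = ("My University", 800, 550, True)
participant_value = proposals_per_100_fte(participant[1], participant[2])

# Means for all participants and for the NSF top-100 comparison group
all_values = [proposals_per_100_fte(p, f) for _, p, f, _ in institutions]
top100_values = [proposals_per_100_fte(p, f) for _, p, f, top in institutions if top]

print(f"Participant: {participant_value:.1f} proposals per 100 faculty FTE")
print(f"All-participant mean: {sum(all_values) / len(all_values):.1f}")
print(f"NSF top-100 mean: {sum(top100_values) / len(top100_values):.1f}")
```

Normalizing by faculty FTE in this way is what allows institutions of very different sizes to be compared on the same scale.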

The graph in the second figure plots direct data rather than ratio measures. It compares the participant's staffing levels in post-award financial administration with the mean staffing levels of all other participants and of the NSF top-100 comparison group. This figure shows a post-award financial administration staff that is large relative to the means of both comparison groups.

The FY 1998 survey yielded some interesting results with regard to competitiveness, cost, and efficiency. The results generally confirmed the conventional wisdom that sponsored research administration at larger, more research-intensive institutions appears to be more cost-effective, with higher median levels of proposals and projects per sponsored programs administration employee (FTE) or operating dollar.
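
As a rough illustration of the kind of efficiency ratio behind that finding, the sketch below computes the median number of proposals handled per sponsored programs administration (SPA) FTE for two institution groups. The numbers and group labels are fabricated for illustration and are not survey results.

```python
# Hypothetical sketch of the cost/efficiency comparison described above:
# median proposals per sponsored programs administration (SPA) FTE,
# grouped by institution size. All values are invented.

from statistics import median

records = [
    # (institution group, proposals handled, SPA administration FTE)
    ("large", 2100, 30),
    ("large", 1800, 24),
    ("large", 2600, 33),
    ("small", 300, 8),
    ("small", 450, 11),
    ("small", 250, 7),
]

def median_proposals_per_admin_fte(group):
    """Median proposals handled per SPA FTE within one institution group."""
    ratios = [proposals / fte for g, proposals, fte in records if g == group]
    return median(ratios)

for group in ("large", "small"):
    print(f"{group} institutions: {median_proposals_per_admin_fte(group):.1f} "
          "median proposals per SPA FTE")
```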
