Federal Performance Reporting: What a Difference Ten Years Makes! Since GPRA Implementation, Annual Reporting Quality Has Improved Substantially, and Agencies with Better Reports Increasingly Use Performance Information as a Management Tool


In 1999--a decade ago--federal agencies issued their first annual performance reports mandated by the Government Performance and Results Act (GPRA) of 1993. That same year, the Mercatus Center at George Mason University launched a research project to evaluate the quality of the reports each year. Ten years makes a difference:

* Ten years ago, one agency's report almost filled a copier paper box and weighed 30 pounds. For fiscal year 2008 (FY08), twenty agencies produced brief "citizens' reports" to make their performance results more accessible to the general public; most of these reports were shorter than thirty pages.

* Ten years ago, sixteen agencies made their reports available online, but only four clearly labeled the report as their annual performance report and made it easy to find. For FY08, twenty-three reports were available online shortly after they were due to Congress. Thirteen agencies posted their reports on time, created a direct link on their homepage, permitted downloads as both single and multiple files, and provided contact information for questions or comments.

The content of the reports has improved substantially as well, both quantitatively and qualitatively.

By the Numbers

Each year, our researchers examine the reports produced by the twenty-four agencies covered under the Chief Financial Officers (CFO) Act of 1990, which account for the vast majority of federal outlays.

The scoring process evaluates (1) how transparently an agency discloses its successes and failures, (2) how well an agency documents the tangible public benefits it claims to have produced, and (3) whether an agency demonstrates forward-looking leadership that uses annual performance information to devise strategies for improvement. An expert team evaluates each report on twelve criteria--four each for transparency, public benefits, and leadership. On each criterion, the report receives a score that can range from 1 (no useful content) to 5 (best practice that other agencies should adopt). The maximum possible score is 60, and the minimum is 12. An average of 3 points on every criterion yields a score of 36, which could be considered "satisfactory."
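To make the arithmetic concrete, the short Python sketch below tallies a hypothetical report the way the scorecard describes: twelve criterion scores of 1 to 5, grouped four apiece under transparency, public benefits, and leadership. The function and variable names are ours for illustration only; they are not the Mercatus Center's actual evaluation tooling.

```python
# Illustrative sketch of the scorecard arithmetic described above (not official code).
CRITERIA_PER_AREA = 4
AREAS = ("transparency", "public benefits", "leadership")

def total_score(criterion_scores):
    """Sum twelve criterion scores, each from 1 (no useful content)
    to 5 (best practice that other agencies should adopt)."""
    if len(criterion_scores) != CRITERIA_PER_AREA * len(AREAS):
        raise ValueError("expected twelve criterion scores")
    if any(not 1 <= s <= 5 for s in criterion_scores):
        raise ValueError("each criterion score must be between 1 and 5")
    return sum(criterion_scores)

# A report averaging 3 on every criterion totals 36, the "satisfactory" threshold.
print(total_score([3] * 12))  # 36
print(total_score([5] * 12))  # 60, the maximum possible score
print(total_score([1] * 12))  # 12, the minimum possible score
```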

Figure 1 shows that average scores have risen by about 15 percent. Some individual agencies improved their reports by much larger amounts. The U.S. Department of Labor's report was the most improved over the decade, rocketing from 36 points and fifth place in FY99 to a record 56 points and first place in FY08. Seven agencies gained more than 10 points between FY99 and FY08: the Nuclear Regulatory Commission (NRC), the National Science Foundation (NSF), and the Departments of State, Health and Human Services (HHS), Commerce, Justice, and Agriculture. The Department of Homeland Security has also increased its score by 13 points since FY04, the first year its report was included in the scorecard.

[FIGURE 1 OMITTED]

These score data understate the full extent of improvement, however, because the research team tightens the scoring criteria over time as new best practices emerge. Our reevaluation of the four best reports from FY99 finds that they would rank well below average when judged on the same twelve criteria but against FY08's higher standards.

Table 1 shows where these FY99 reports would have ranked compared with the reports agencies produced for FY08. Evaluated by FY08 standards, the best FY99 report, from the U.S. Agency for International Development (USAID), would rank just sixteenth in FY08. The three other FY99 reports reevaluated under FY08 standards were from the Departments of Transportation, Veterans Affairs, and Education. All would have done worse under FY08's tighter standards. From these evaluations, we estimate that the average quality of performance reports has improved by about 75 percent since FY99.

Best Practices

Because the Mercatus scorecard explicitly identifies best practices every year, comparing some FY99 best practices with their FY08 counterparts is not difficult. …