Press Release

Second Annual Performance Report Scorecard: Which Federal Agencies Inform the Public? – Mercatus Center at George Mason University

By SpaceRef Editor
May 22, 2001

  • Full report, Mercatus Center at George Mason University



    Executive Summary

    Federal agencies’ annual performance reports should give Congress and the public accurate, timely information documenting the tangible public benefits the agencies produce. The Bush administration has indicated that it will begin using agency performance information later this year when making budget decisions for fiscal year 2003. We at the Mercatus Center conducted our second annual evaluation of the reports produced by the 24 agencies covered under the Chief Financial Officers Act, applying the same criteria to the fiscal year 2000 performance reports that we applied to the fiscal year 1999 reports. Our scorecard continues to address three questions:

    • Does the agency report its results in a transparent fashion that makes both accomplishments and problems clear?
    • Does the report focus on documenting tangible public benefits the agency produced?
    • Does the report show evidence of forward-looking leadership that uses performance information to devise strategies for improvement?

    By assessing the quality of agency reports, we seek to ascertain which agencies are supplying the information that Congress and the public need to make informed funding and policy decisions.

    Best reports: For fiscal 2000, the Department of Veterans Affairs’ performance report scored highest, followed closely by the Department of Transportation’s. The fiscal 2000 report of the U.S. Agency for International Development ranked third, down from first place in fiscal 1999.

    Reports most in need of improvement: Of the 23 reports released by April 27, the National Aeronautics and Space Administration’s scored lowest. The Department of Health and Human Services and the Nuclear Regulatory Commission rounded out the bottom three. The Department of Agriculture released its report on May 1, more than a month after the statutory deadline and too late to be included in our analysis.

    Most improved reports: The National Science Foundation exhibited the most improvement since last year, jumping from last place in fiscal 1999 to sixth in fiscal 2000. The Department of Justice showed the next largest improvement, rising from 21st place to fifth.

    Most common strengths: (1) readability of the reports, (2) clear articulation of results-based goals, and (3) discussion of major management challenges.

    Most common weaknesses: (1) failure to make reports accessible to the public, (2) failure to demonstrate a cause-and-effect relationship between an agency’s actions and observed outcomes, and (3) failure to link performance data to costs. Notwithstanding these common weaknesses, a number of agencies did well on these three attributes.

    Modest but widespread improvement: The average score was approximately 5 percent higher for the fiscal 2000 reports than for the fiscal 1999 reports, even though scoring was more stringent. More than half of the reports scored higher in fiscal 2000 than in fiscal 1999.



    Appendix Excerpt

    TRANSPARENCY

    • Report not found on NASA web site.
    • Organization is confusing (enterprises and cross-cutting processes), summary tables are hard to
      decipher, technical language is excessive, use of acronyms is tiresome, measures are too
      numerous, and editing is poor.
    • Reliability of data is suspect: the inspector general conducted audits only on “selected”
      measures, the Advisory Council review of agency performance is considered part of the data
      verification and validation process, and sources and reliability checks are all internal.
    • Trends cannot be discerned because most measures take the form of activity lists.

    PUBLIC BENEFITS

    • At the organization-wide level, NASA has not stated goals at all; instead, it has identified
      “enterprises” and “processes” – each of which contains numerous goals and objectives.
    • Most performance measures are activity-based, usually reading like a “to do” list.
    • It is very difficult to discern links between activities and the agency’s ultimate objectives. Some
      progress is evident, and since NASA has few competitors in the field, it is reasonable to credit
      the agency with whatever results are obtained. This raises the question, however, of what is
      achievable.
    • No link between goals, results, and costs.

    LEADERSHIP

    • Though NASA arguably has the most exciting and visionary job of any federal agency, its report
      fails to communicate the benefits of its activities.
    • Failures at the performance target level are explained, but strategic failures are most often
      blamed on external factors.
    • The International Space Station is the only major management challenge discussed.
    • No convincing discussion of how NASA plans to improve in the future.
