OMB Watch Comments on Program Assessment Ratings Tool from OMB

The following are comments sent by OMB Watch to OMB on the Program Assessment Ratings Tool (PART), arising from the first meeting of the Performance Measurement Advisory Council. The comments are also available in Adobe Acrobat PDF.


July 3, 2002

Thomas M. Reilly
Designated Federal Officer
Performance Measurement Advisory Council
Executive Office of the President
Office of Management and Budget
7002 New Executive Office Building
Washington, DC 20503

Dear Mr. Reilly:

My name is Ellen Taylor, and I am a budget policy analyst at OMB Watch, a nonprofit research and advocacy organization that seeks to promote government accountability and citizen participation. During the past three years, OMB Watch has sought to increase the participation of nonprofit groups in the implementation of the Government Performance and Results Act (GPRA). In that effort, we have conducted surveys of nonprofits, executive agencies, and congressional staff, and hosted meetings, especially in the environmental arena, to monitor implementation and opportunities for nonprofit involvement, as well as ways of ensuring that the public can verify and validate the performance results. We also maintain an email list of hundreds of people to discuss issues surrounding GPRA.

I attended the first meeting of the Performance Measurement Advisory Council on June 27, where the Program Assessment Ratings Tool, or "PART," was discussed. On behalf of OMB Watch, I would first like to express our appreciation that open meetings of the Advisory Council are being held. A primary purpose of GPRA was to improve government performance and to measure its results in order to give citizens more confidence in their government, both by showing measurable successes and by providing tools to make improvements where needed. To accomplish these goals, it is important that the process of measuring government be transparent and open to the public. While we appreciate the meeting being open, it would have been more useful to reserve a period of time (thirty minutes or so) to allow observers to make comments, especially given the limited input on the PART process from groups outside of government and academia.

The following are our comments on the PART tool, arising from the meeting of the Advisory Council.

  • In terms of the transparency and openness of the process, there was little information about the development of the PART, including its origin, whether anyone in addition to the Performance Evaluation Team assisted in the design of the tool, or how the methodology was developed. It is very important that the Advisory Council document issues concerning the development of the measurement metric.

  • We were disappointed, also, that the Advisory Council did not include any members from nonprofit advocacy or service provider groups. This would have added a different and important perspective to the work that the Council is doing. Many nonprofit groups care deeply about government performance, have used results-oriented techniques in their own work, and recognize the importance of public trust in government.

  • Even after reading the materials posted on the website and attending the meeting, we remain unsure as to the actual purpose of the PART, its relationship to GPRA, and why it is necessary. The question that was raised about whether the PART was a self-assessment tool to help agencies implement GPRA more effectively, or an exercise to combine the President's management initiatives with GPRA performance indicators and output measurements to achieve a more simplified "grade" of agency performance, was never fully answered.

  • We have long been suspicious of the trend to give overall "grades" to agencies in efforts to simplify GPRA. We are especially concerned that these simplified "grades," like the "traffic signal" effort in the President's 2003 Budget, might ultimately be used to justify budget decisions. We think that would be a mistake. Would a low score on the PART suggest that the agency needs more money to improve the score, or that the program should be eliminated or reduced? A metric with uncertain measures, which we think the PART currently represents (see comments below), will not be helpful in making important budget decisions. Given the difficulty agencies have faced in fully complying with GPRA, we are concerned that the PART will only add to the burden, and possibly take some of the energy that ought to be going into developing good indicators and data for GPRA purposes.

  • There was no indication that the development or testing of the PART tool included consultations with nonprofit groups, or with nonprofit service providers responsible for implementing federal programs. Much of our effort in the first years of GPRA implementation was attempting to get nonprofit groups and agencies to work together in developing the strategic plans. Unfortunately, there is no analogous requirement for nonprofit and other stakeholder involvement in the development of the performance plans, which are the real meat and potatoes of GPRA. Nonprofits lack a venue for giving valuable input into the development of effective indicators to measure the results that are important to the public. The PART tool could be another vehicle for nonprofits to participate in the process, yet there is no evidence that this has occurred or is intended. Nonprofits should be involved not only because of their expertise, but also because they are often a partner with government in the delivery of services.

  • In addition to our concern about simplifying a very complex operation (the federal government) into overall grades, we are concerned that the PART appears to be an extremely subjective instrument. The "yes" and "no" format, the ability of a department to choose not to answer some of the questions, and the nature of the questions all concern us. There appears to be no underlying standard of what constitutes a "yes" or "no" answer, nor any allowance for showing that positive progress is being made toward achieving a goal. To be useful, the tool's subjectivity needs to be reduced and more specificity included. At the same time, we caution against reducing the important process of determining budget priorities to a quantifiable, mechanistic approach.

  • Other than statements to the effect that "beta testing" with some departments resulted in overall positive comments, and led to the "Major Issues for Consideration in Revising the PART: Preliminary Recommendations by Performance Evaluation Team," there is nothing on the website or in the materials detailing who participated in the beta test, what guidance they were given, or a report of their comments. Such information would be useful in further refining the PART tool.

  • There is nothing on the website that identifies the 20% of federal departments that will be using the PART. (Advisory Council members requested this information.) The identity of the departments and the programs chosen is important to nonprofits as the PART process moves forward.

  • We agree with Advisory Council member Harry Hatry that the three elements of the PART (program purpose, management, and results) need to be separated out. The results-based information should not only be given a higher weight, but also stand alone as a score separate from the other elements of the PART. There should be both a program management score and a results score, and the results score should be the stronger indicator of program effectiveness.

If the PART instrument is intended to make the performance evaluations included in the President's 2003 budget more "robust, credible and transparent," it should not do so at the expense of over-simplification. More effort needs to be made to work with the nonprofit sector if the PART is to achieve its long-term goals.

Thank you for your attention to this matter.


Ellen Taylor
Senior Budget Policy Analyst
