The Program Assessment Rating Tool (PART)

Presentation transcript:

The Program Assessment Rating Tool (PART)
Mary Cassell, Office of Management and Budget
April 28, 2011

Overview
- What is the PART?
- How was it developed?
- What are the components?
- Quality controls
- How was the PART used?
  - Budget
  - Program improvements
- Lessons learned

"In God we trust… all others, bring data." - W. Edwards Deming

Introduction
- The PART was a component of the Bush Administration's Management Agenda that focused on Budget and Performance Integration
- The PART promoted efforts to achieve concrete and measurable results
- The PART supported program improvements

What is the Program Assessment Rating Tool (PART)?
- A set of questions that evaluates program performance in four critical areas:
  - Program Purpose and Design
  - Strategic Planning
  - Program Management
  - Program Results and Accountability
- A tool to assess performance using evidence
- Provides a consistent, transparent approach to evaluating programs across the Federal government

Why PART?
- Measure and diagnose program performance
- Evaluate programs in a systematic, consistent, and transparent manner
- Inform agency and OMB decisions on resource allocations
- Focus on program improvements through management, legislative, regulatory, or budgetary actions
- Establish accountability for results

How did the PART work?
- Answers to questions generated scores, which were weighted and tallied into a total score
- Answers were based on evidence, evaluations, and data
- Ratings were based on total scores: Effective, Moderately Effective, Adequate, Ineffective
- Results Not Demonstrated was assigned to programs that did not have performance measures or data, regardless of overall score

PART Questions and Process
- Roughly 25-30 analytical questions; explanations and evidence are required
- Standards of evidence hold programs to a high bar
- Question weight can be tailored to reflect program specifics
- Interactions between questions
- Yes/No answers in diagnostic sections; four levels of answers in results section
- Collaborative process with agencies; OMB had the pen
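
To make the answer-and-weighting mechanics above concrete, here is a minimal sketch of how answers within one section might roll up into a section score. The yes/no answers in the diagnostic sections, the four-level answers in the results section, and the tailorable question weights come from the slide above; the answer labels and partial-credit values in the code are assumptions for illustration, not figures from this presentation.

    # Illustrative sketch of question-level scoring within one PART section.
    # Answer labels and partial-credit values are assumptions; only the idea of
    # yes/no diagnostic answers, four-level results answers, and tailorable
    # question weights comes from the slides.
    ANSWER_CREDIT = {
        "Yes": 1.0,
        "Large Extent": 0.67,   # assumed partial credit (results section only)
        "Small Extent": 0.33,   # assumed partial credit (results section only)
        "No": 0.0,
    }

    def section_score(answers, weights):
        """Tally weighted answers into a 0-100 section score.

        answers: {question_id: answer label}
        weights: {question_id: weight}; weights can be tailored per program
                 and are assumed to sum to 1.0 within the section.
        """
        return 100 * sum(weights[q] * ANSWER_CREDIT[a] for q, a in answers.items())

    # Example: a results section with four equally weighted questions.
    answers = {"long_term_goals": "Large Extent", "annual_goals": "Yes",
               "efficiency_gains": "Small Extent", "independent_evals": "No"}
    weights = {q: 0.25 for q in answers}
    print(section_score(answers, weights))  # 50.0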

How was the PART developed?
- Designed by 12 OMB career staff, including one representative from each division
- Piloted with about 60 programs
- The pilot generated extensive input from agencies that resulted in several revisions, including changes in scoring and elimination of a question about whether the program served an appropriate federal role
- Conducted trial runs with research institutions
- Agency roll-out: OMB training, agency meetings, agency trainings
- Incorporation into 2002 budget decisions and materials
- Development, pilot, and revision process took about 6 months, including development of guidance and training

PART Program Types
- Direct Federal
- Competitive Grant
- Block/Formula Grant
- Regulatory Based
- Capital Assets and Service Acquisition
- Credit
- Research and Development

PART Questions
Section I: Program Purpose & Design (20%)
- Is the program purpose clear?
- Does the program address an existing problem or need?
- Is the program unnecessarily duplicative?
- Is the program free of major design flaws?
- Is the program targeted effectively?
Section II: Strategic Planning (10%)
- Does the program have strong long-term performance measures?
- Do the long-term measures have ambitious targets?
- Does the program have strong annual performance targets?
- Does the program have baselines and ambitious targets?
- Do all partners agree to the goals and targets?
- Are independent evaluations conducted of the program?
- Are budgets tied to performance goals?
- Has the program taken steps to correct strategic planning deficiencies?

PART Questions
Section III: Program Management (20%)
- Does the program collect timely performance information and use it to manage?
- Are managers and partners held accountable for program performance?
- Are funds obligated in a timely manner?
- Does the program have procedures (IT, competitive sourcing, etc.) to improve efficiency?
- Does the program collaborate with related programs?
- Does the program use strong financial management practices?
- Has the program taken meaningful steps to address management deficiencies?
- Additional questions for specific types of programs
Section IV: Program Results (50%)
- Has the program made adequate progress in achieving its long-term goals?
- Does the program achieve its annual performance goals?
- Does the program demonstrate improved efficiencies?
- Does the program compare favorably to similar programs, both public and private?
- Do independent evaluations show positive results?
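
Combining the section weights listed on these two question slides with the rating bands described on the earlier "How did the PART work?" slide, a rough sketch of the overall rollup could look like the following. The 20/10/20/50 section weights come from the slides; the rating cut-offs (85, 70, 50) and the way the Results Not Demonstrated override is applied are assumptions for illustration only.

    # Illustrative rollup of section scores into a total score and rating.
    # Section weights (20/10/20/50) are from the slides; the rating cut-offs
    # and the Results Not Demonstrated override behavior are assumptions.
    SECTION_WEIGHTS = {
        "purpose_and_design": 0.20,
        "strategic_planning": 0.10,
        "program_management": 0.20,
        "program_results": 0.50,
    }

    def total_score(section_scores):
        """Weight each 0-100 section score and tally the total."""
        return sum(SECTION_WEIGHTS[name] * score
                   for name, score in section_scores.items())

    def rating(total, has_measures_and_data):
        """Map a total score to a rating band (cut-offs assumed)."""
        if not has_measures_and_data:
            # Assigned regardless of the overall score.
            return "Results Not Demonstrated"
        if total >= 85:
            return "Effective"
        if total >= 70:
            return "Moderately Effective"
        if total >= 50:
            return "Adequate"
        return "Ineffective"

    sections = {"purpose_and_design": 80, "strategic_planning": 90,
                "program_management": 75, "program_results": 60}
    print(total_score(sections))                 # 70.0
    print(rating(total_score(sections), True))   # Moderately Effective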

Performance Measures, Data and Evaluations
- Strong focus on performance measures
- Performance measures should capture the most important aspects of a program's mission and priorities
- Key issues to consider: 1) performance measures and targets; 2) focus on outcomes whenever possible; 3) annual and long-term timeframes
- Efficiency measures required
- Rigorous evaluations are strongly encouraged

Quality Controls
The PART was a tool used to guide a collective analysis, not a valid and reliable evaluation instrument. It therefore required other mechanisms to promote consistent application:
- Guidance and standards of evidence
- Training
- On-going technical assistance
- Consistency check
- Appeals process
- Public transparency

How was the PART used? A Focus on Improvement
- Every program developed improvement plans
- Focus on findings in the PART assessments
- Implementation of plans and reporting on progress
- Reassessments occurred once the program had made substantive changes

The Use of the PART in the Budget Process
- Informed budget decisions (funding, legislative, and management)
- Increased prominence of performance in the Budget
- Increased accountability and focus on data and results

Example: Migrant Education and the PART
- Collaborative process between OMB and the program office
- The program office provided evidence to back up PART answers (such as monitoring instruments, State data, action plans, etc.)
- OMB and ED met to discuss evidence
- OMB and ED shared PART drafts
- ED developed follow-up actions

Migrant Education PART
PART Findings:
- Program is well-designed and has a good strategic planning structure
- Program is well-managed
- Issues relating to possible inaccuracies in the eligible student count are being addressed
- States are showing progress in providing data and in improving student achievement
Results section:
- Ensure all States report complete and accurate data
- Continue to improve student achievement outcomes
- Improve efficiencies, in particular in the migrant student records transfer system
- Complete a program evaluation
Areas for Improvement and Action Steps for Migrant Education:
- Complete national audit of child eligibility determinations
- Implement and collect data on the Migrant Student Information Exchange (MSIX)
- Use data, in particular on student achievement, to improve performance

Distribution of Ratings Government-wide

Ratings by year (2002, 2003, 2004, 2005):
- Effective: 6%, 9%, 13%, 15%
- Moderately Effective: 24%, 19%, 22%, 23%
- Adequate: 14%, 25%
- Ineffective: 5%, 3%, 1%
- Results Not Demonstrated: 51%, 31%, 28%

Speaker notes (examples of commonly known and/or large programs by rating, based on FY 2006 estimates):
- Effective / Ineffective: Community Health Centers (2002); Health Professions (2002); Homeless Assistance Grants, Competitive (2005); CDBG (2003); Customs and Border Protection Border Security Inspections and Trade Facilitation at Ports of Entry program (2005)
- Moderately Effective / Results Not Demonstrated: Corps Emergency Management (reassessment, 2004); The Emergency Food Assistance Program (2005); Medicare (2003); Flood Damage Reduction (2002)
- Adequate: Disability Compensation (2002); IRS Tax Collection (2002); VA Medical Care (reassessment, 2003); Head Start (2002); AmeriCorps (reassessment, 2005); Space Shuttle (reassessment, 2005)
The notes caution that Ineffective and Results Not Demonstrated ratings may spike in the following year as some of the more difficult and complex programs are finally assessed.

Department of Education Cumulative Ratings
[Chart: cumulative distribution of PART ratings for Department of Education programs]

Lessons Learned: Pros
- Focus on results, data, performance measurement, evaluation
- Program improvements
- Common analysis
- Transparency
- Cross-program and cross-agency comparisons between similar programs
- Identification of best practices
- Informed budget decisions

Lessons Learned: Cons
- Not consistent enough to allow trade-offs between unlike programs
- Better for program improvement than accountability, unless coupled with strong evaluation
- Became too burdensome
- Not fully embraced by agencies or Congress