PPA 502 – Program Evaluation Lecture 2b – Evaluability Assessment.


Evaluability Assessment
 Problems confronting evaluation:
 – Evaluators and users fail to agree on the goals, objectives, side effects, and performance criteria to be used in evaluating the program.
 – Program goals and objectives are found to be unrealistic given the resources committed to them and the program activities underway.
 – Relevant information on program performance is often not available.
 – Administrators at the policy or operating level are unable or unwilling to change the program on the basis of evaluation information.

Evaluability Assessment
A program is ready for useful evaluation when:
 Program goals, objectives, important side effects, and priority information needs are well defined.
 Program goals and objectives are plausible.
 Relevant performance data can be obtained.
 The intended users of the evaluation results have agreed on how they will use the information.
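The four readiness criteria above can be sketched as a simple checklist. This is a hypothetical illustration only; the class and field names are invented for the example, not part of the lecture:

```python
# Hypothetical sketch: the four evaluability criteria as a checklist.
from dataclasses import dataclass

@dataclass
class EvaluabilityCheck:
    goals_well_defined: bool   # goals, objectives, side effects, info needs defined
    goals_plausible: bool      # plausible given resources and activities underway
    data_obtainable: bool      # relevant performance data can be obtained
    uses_agreed: bool          # intended users agreed on how results will be used

    def is_evaluable(self) -> bool:
        # A program is ready for full-scale evaluation only if all four hold.
        return all((self.goals_well_defined, self.goals_plausible,
                    self.data_obtainable, self.uses_agreed))

check = EvaluabilityCheck(True, True, True, False)
print(check.is_evaluable())  # False: users have not agreed on how results will be used
```

The point of the all-four test is that a single missing condition (here, no agreement on use) is enough to make further evaluation premature.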

Key Steps in Evaluability Assessment
 Involve the intended users of evaluation information.
 Clarify the intended program from the perspectives of policymakers, managers, staff, and other key stakeholders.
 Explore program reality, including the plausibility and measurability of program goals and objectives.
 Reach agreement on any needed changes in program activities and objectives.
 Explore alternative evaluation designs.
 Agree on evaluation priorities and intended uses of information on program performance.

Gaining and Holding the Support of Managers
 Form a policy group and a work group to involve policymakers, managers, and key staff in the evaluation.
 Clarify the types of products and results expected.
 Use briefings to present:
 – The perspectives of policymakers and managers,
 – The reality of program operations, and
 – Options for changes in program activities or in the collection and use of information on program performance.

Clarifying Program Intent
 Develop program design models documenting program resources, program activities, important intended program outcomes, and assumed causal linkages from the perspectives of key policymakers, managers, and interest groups.
 Develop program design models at varying levels of detail.
 Use more detailed program design models to ensure that evaluators and managers have a common understanding of the intended program, including negative side effects to be minimized.

Clarifying Program Intent (contd.)
 Use less detailed program design models to focus briefings and discussions on key issues.
 Develop lists of currently agreed-on performance indicators and possible new performance indicators to ensure a common understanding of the goals, objectives, important side effects, and performance indicators to be used in subsequent evaluation work.
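A program design model of the kind described above can be represented as a small data structure recording resources, activities, intended outcomes, and the assumed causal linkages between them. The program names and the `unlinked_outcomes` helper below are invented for illustration; they are assumptions, not content from the lecture:

```python
# Hypothetical sketch: a program design (logic) model as a data structure.
model = {
    "resources":  ["federal grant", "case managers"],
    "activities": ["job-skills training", "placement counseling"],
    "outcomes":   ["trainees complete course", "trainees obtain jobs"],
    # Assumed causal linkages: (activity, intended outcome) pairs.
    "linkages": [
        ("job-skills training", "trainees complete course"),
        ("placement counseling", "trainees obtain jobs"),
    ],
}

def unlinked_outcomes(m):
    """Outcomes that no activity is assumed to produce.

    A non-empty result flags objectives whose plausibility the
    evaluability assessment should question.
    """
    linked = {outcome for _, outcome in m["linkages"]}
    return [o for o in m["outcomes"] if o not in linked]

print(unlinked_outcomes(model))  # [] — every intended outcome is linked to an activity
```

Making the causal linkages explicit is what lets evaluators and managers check, outcome by outcome, whether the intended program could plausibly produce what it promises.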

Exploring Program Reality
 Focus on descriptions of actual program activities and outcomes, reviews of performance measurement systems currently in use, and descriptions of especially strong project performance and of problems inhibiting effective program performance.
 Use site visits and prior reports to make preliminary estimates of the likelihood that program objectives will be achieved.
 Identify feasible measures of program performance.

Reaching Agreement on Any Needed Changes in Program Design
 If appropriate, suggest changes in program design that appear likely to improve program performance.
 Proceed by successive iterations, spelling out the likely costs and likely consequences of the program change options of greatest interest to program managers.

Exploring Alternative Evaluation Designs
 Spell out the costs and intended uses of evaluation options: measurements of specific variables or tests of specific causal assumptions.
 Present examples of the type of data to be produced.
 Interact with intended evaluation users at frequent intervals.
 Hold managers’ interest by providing early evaluability assessment products.
 Brief key managers and policymakers on evaluability assessment findings and options.

Exploring Alternative Evaluation Designs (contd.)
 Explain the implications of the “status quo option” (no further evaluation) and the costs and potential uses of various evaluation options.
 Ensure that a mechanism is available for speeding initiation of follow-on evaluation procedures.

Documenting Policy and Management Decisions
 Conclude each phase of an evaluability assessment with a brief memorandum documenting the significant decisions made in meetings with managers and policymakers.

Proceeding by Successive Iterations
 Do the entire evaluability assessment once, early in the assessment; obtain tentative management decisions on program objectives, important side effects, evaluation criteria, and intended uses of evaluation information; and redo portions of the evaluability assessment as often as necessary to obtain informed management decisions.

Reducing Evaluability Assessment Costs
 Minimize production of intermediate written products.
 Use briefings that present the information required for management decisions.