Department of State Program Evaluation Policy Overview
Spring 2013

Overview of Policy and Status
The Policy was approved by the Secretary on March 1, 2012; it is the State Department's first such evaluation policy.
The Policy defines evaluation as: a systematic and objective assessment of an on-going or completed project, program, or policy. Evaluations are undertaken to (a) improve the performance of existing interventions or policies, (b) assess their effects and impacts, and (c) inform decisions about future programming. Evaluations are formal analytical endeavors involving systematic collection and analysis of qualitative and quantitative information.

Policy Application
The Policy applies to evaluating the Department's diplomatic, management, and development programs, projects, and activities.
Evaluation is essential to the Department's ability to:
– Measure and monitor program performance
– Make decisions about programmatic adjustments and changes
– Document program impact
– Identify best practices and lessons learned
– Assess return on investment
– Provide inputs for policy, planning, and budget decisions
– Assure accountability
The Policy intends to:
– Provide a coordinated strategy and work plan for conducting evaluations
– Respond to stakeholder demands for transparency in decision-making
– Work in concert with existing and pending Department policies, strategies, and operational guidance

The Department of State
The Department is very large and diverse: it has 51 major bureaus and offices covering a huge array of fields, from nuclear disarmament to health, law enforcement, human trafficking, and visas and passports, and it has over 200 posts overseas. The evaluation policy therefore covers not only programs but also management and diplomacy.

APPLICABILITY & PURPOSE
Effective March 1, 2012, the evaluation policy applies to all State bureaus and major offices, such as S/GAC.
The Policy proposes a framework to implement evaluations and is intended to provide:
– Clarity about the purposes of evaluation
– Evaluation requirements
– Types of evaluation
– An approach for conducting, disseminating, and using evaluations
Evaluation at the Department has two primary purposes:
– ACCOUNTABILITY: help determine cost effectiveness and planning and implementation quality; provide empirical data for reports to various stakeholders
– LEARNING: document the results, impact, and effectiveness of activities; inform decision-making

EVALUATION REQUIREMENTS
The Policy requires that "all large programs, projects, and activities be evaluated at least once in their lifetime or every five years, whichever is less."
– Programs, projects, and activities are broadly defined so that bureaus can apply the policy at the level where most of their funding is used.
– Beginning in FY 2012, bureaus are required to evaluate programs, projects, and activities within a 24-month period. Posts will be added in a later fiscal year.
– Bureaus must also ensure that implementing organizations carry out evaluations consistent with the Policy's guidelines.

EVALUATION TYPES & STANDARDS
Types: Performance Evaluations vs. Impact Evaluations
– PERFORMANCE: examines the inputs, outputs, outcomes, and performance of an intervention
– IMPACT: measures the change attributable to a given intervention
The expectation is that most evaluations conducted will be Performance Evaluations. Whether Performance or Impact, all evaluations must be context-sensitive, independent, and methodologically sound.

EVALUATION TYPES & STANDARDS
Standards: Methodological Rigor, Independence and Integrity
Methodological Rigor
– Evaluations should be "evidence based," meaning that they are based on verifiable data and information that have been gathered using professional evaluation standards.
– Data must be reliable and valid.
– Qualitative and quantitative data are acceptable.
Independence and Integrity
– Bureaus must ensure that evaluators are free from any pressure and/or bureaucratic interference.
– Bureau staff and managers should be actively engaged during the evaluation process, but that engagement must not improperly influence the outcome.

EVALUATION USE & BUREAU EVALUATION PLANS
EVALUATION USE
– Evaluation findings must be integrated into decisions about strategic plans, program priorities, project design, planning, and budget formulation.
BUREAU EVALUATION PLANS
– Bureaus must develop and submit an annual Bureau Evaluation Plan (BEP).
– The BEP will cover a three-year period and will be updated annually.

EVALUATION RESPONSIBILITY
AGENCY LEVEL
– F and BP will work closely with the Performance Improvement Officer (PIO) to assist bureaus in implementing the policy.
– F coordinates evaluations of Foreign Assistance-funded programs, projects, and activities.
– BP coordinates evaluations of State Operations-funded programs, projects, and activities.
– F and BP will coordinate tools, technical support, funding, and evaluation training to assist bureaus in implementing the Policy.
BUREAU LEVEL
– Management responsibility: it is the bureau's responsibility to ensure that evaluations are planned, budgeted, and conducted. Bureaus are asked to budget for evaluations in their 2014 BRR.
– Implementation responsibility: it is the bureau's responsibility to implement and manage evaluations.

BUREAU POINT OF CONTACT & EVALUATION RESOURCES
BUREAU POINT OF CONTACT
Each bureau must identify a senior staff person to be the Bureau Point of Contact for evaluation, who:
– Should be a Deputy Assistant Secretary or their designee
– Will be the main point of contact in the bureau on evaluation
– Will interact with the PIO, F, and BP on the bureau's evaluation efforts and compliance with the evaluation policy
EVALUATION RESOURCES
Performance monitoring and evaluation are allowable Foreign Assistance program costs.
– Evaluation costs will vary by program, so no set amount is prescribed.
– Based on international professional standards, program managers should identify resources of up to 3-5% of program funds for evaluation activities.

SUPPORT AND TECHNICAL ASSISTANCE
– Indefinite Delivery/Indefinite Quantity (IDIQ) contracts: five IDIQ contracts facilitate contractual services for evaluations: Dexis Consulting Group; Development and Training Services; DevTech Systems; Social Impact, Inc.; and Weidemann Associates.
– Evaluation guidance: covers planning for evaluations; statements of work (SOWs); data collection methods; evaluation reports; using evaluation information; confidentiality; and the role of the Bureau Evaluation Coordinator.
– Training: the Department is providing training and technical assistance (TA&T) to bureau staff and will coordinate TA&T with Bureau Evaluation Coordinators.

DOCUMENTING & SHARING EVALUATION REPORTS
– Bureaus and posts must maintain an official copy of completed evaluation reports.
– Final evaluation reports will be sent to a new internal website.
– Bureaus and posts are required to post copies of their evaluation reports on their OpenNet or ClassNet website homepage.
– Summaries of completed evaluations will be reported in the Department's Annual Performance Report.

Implementation Efforts and Status
– The first task was to build staff capacity and a culture that values evaluation.
– Two professional training programs are in operation, and over 100 staff have been trained thus far.
– A very active Community of Practice (100 members) meets monthly to share information.
– There are 35 ongoing evaluations and over 100 planned this year.