Performance Expectations of LFA Programmatic/M&E Experts
LFA M&E Training, February 2014

Session objectives
- Clarity on the role of the LFA programmatic health/M&E expert in collaboration with the Global Fund Country Team
- A common understanding of performance expectations
- Concrete examples of best practices

Session structure
1. Small group discussion: LFA self-assessment (45 min)
2. Clarifying roles and expectations (30 min)
3. Revised PET (5 min)
4. Scenario case studies and discussion (25 min)

LFA self-assessment: background
The Local Fund Agent (LFA) M&E/programmatic expert plays a key role by verifying Principal Recipient (PR) results, identifying potential risks to the grant or program, and conveying key contextual information to improve decision-making. This information is most often transmitted to the Country Team (CT) through reporting documents, including on-site data verification (OSDV) reports, rapid service quality assessments (RSQA), Progress Updates/Disbursement Requests (PU/DRs), and LFA assessments.

LFA self-assessment: directions
We will present four guiding questions. Please take the next 25 minutes to discuss them in small groups (tables). We will then reconvene to share answers (20 min).

LFA self-assessment: guiding questions
1. Describe a time when you felt you added value to a particular deliverable or outcome. Why do you think you added value?
2. Describe a time when you felt you missed an opportunity to add value. Why were you not able to add value?
3. What could you have done differently in order to add value?
4. What could the Secretariat Country Team have done differently to help you add value?

Structure of the Secretariat CT vs. the LFA team
Secretariat Country Team: FPM (Fund Portfolio Manager), PO (Program Officer), Public Health/M&E Specialist, Legal Officer, PSM (Procurement and Supply Management) Specialist, Finance Specialist
LFA team: Team Leader, Programmatic/M&E Expert, Finance Expert, PSM Expert

Roles and responsibilities
Global Fund Public Health/M&E Specialist:
- Advise on programmatic aspects of grants
- Negotiate performance targets and establish/follow up on management actions/conditions
- Assess progress to make funding decisions and identify areas for improvement
- Enable M&E systems strengthening and technical assistance provision
- Identify and respond to M&E and programmatic risks in the portfolio
LFA Programmatic/M&E expert:
- Provide information about PR capacity in M&E and program implementation
- Verify programmatic results for disbursement and renewals recommendations
- Identify potential M&E/programmatic risks, including data and service quality issues
- Verify progress against management actions/conditions
- Identify relevant changes in country context (e.g. epidemiological, policy or political changes)

The LFA provides essential information for decision-making
Inputs for improving the grant/program: site data, site visits, the community, local media
Channels to the Secretariat: LFA assessments, RSQA, OSDV, PU/DR, other communications

Global Fund Public Health/M&E Specialist: information sources for decision-making
- National Strategic Plan
- M&E plan
- Evaluations and program reviews
- Technical partners
- 0-4 country visits per year
Imagine being a PH/M&E Specialist at the Secretariat. Could you fulfill your responsibilities based on these information sources alone? If not, what else would you need?

Common information gaps
- What is "on paper" versus what is actually being implemented?
- What works well within the system, and where are the gaps?
- Why are there gaps in service delivery coverage or quality?
- Are there current or potential changes in the country context (political, economic, social) that may affect the program?
- Are there key groups or populations within the country that are not being reached? What are the barriers to reaching them? Does the policy and program framework address these barriers?
- What are the relationships between the PR and Sub-Recipients (SRs), and how can they be strengthened to improve the program?

What are effective LFA practices?
Program design:
- Proactively identify programmatic gaps and advise the Secretariat.
- Note when practice differs from guidelines/protocols; the LFA is best placed to describe realities in country.
- Possess practical knowledge of the in-country M&E system and be able to advise the PH/M&E officer on how it actually functions, e.g. data completeness (does reporting cover public and private sector sites? the community?), data integration (e.g. disease-specific data with the HMIS), and systems (e.g. computerization, and at what levels).
- Possess knowledge of in-country stakeholders beyond the PRs and be able to facilitate meetings for the PH/M&E officer if necessary.
- Proactively monitor the implementation of in-country studies, surveys and evaluations, and share reports with the Secretariat.

What are effective LFA practices? (cont'd)
Grant-making:
- Actively participate in grant-making and advise the Secretariat on issues that may affect implementation.
- Advise on indicator selection based on difficulties noted in PU/DRs (e.g. data sources, indicator definitions, definition of the package of services, reporting flows).
- Provide detailed descriptions of the M&E system, including examples of recording and reporting forms.
Grant implementation:
- Make recommendations in the context of the grant and the upcoming review.
- Identify the needed strengthening measures and/or budget items that could provide the resources.
- Triangulate information with the LFA PSM and LFA Finance experts.

What are effective LFA practices? (cont'd)
PU/DR:
- Identify issues that are not strictly related to the task at hand but are nevertheless important.
- Make the link between performance and funding, particularly where there are large discrepancies between the grant expenditure rate and target achievement.
- Proactively identify issues with indicator definitions or the reporting system, and propose sound recommendations so these can be corrected for the next reporting period.
- Relate overall performance to progress towards outcomes and impact.
- Identify M&E risks and mitigation measures.
OSDV/RSQA:
- Select indicators well and document the methodology to be used, including source documents and any proxy indicators.
- Link the OSDV proposal (indicator/site selection) with issues identified in past OSDVs and PU/DRs.
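The link between performance and funding mentioned above is essentially a comparison of two rates. A minimal sketch of such a check follows; the function names and the 25% tolerance are illustrative assumptions, not Global Fund definitions.

```python
# Illustrative sketch only: names and the tolerance threshold are hypothetical,
# not Global Fund rules.

def achievement_rate(result: float, target: float) -> float:
    """Indicator achievement as a fraction of the target."""
    if target <= 0:
        raise ValueError("target must be positive")
    return result / target

def flag_discrepancy(expenditure_rate: float, achievements: list[float],
                     tolerance: float = 0.25) -> bool:
    """Flag when grant spending and average programmatic achievement
    diverge by more than the (hypothetical) tolerance."""
    avg_achievement = sum(achievements) / len(achievements)
    return abs(expenditure_rate - avg_achievement) > tolerance

# Example: 90% of the budget spent, but indicators averaging ~50% of target.
print(flag_discrepancy(0.90, [0.45, 0.55, 0.50]))  # True: flag for review
```

A flagged discrepancy is only a prompt for investigation; the PU/DR finding would still need to explain the underlying causes.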

What are less effective LFA practices?
Grant-making:
- The LFA attempts Performance Framework (PF) negotiation (rather than the PH/M&E officer), or "plays the role" of the PR.
- The LFA is unprepared to provide the necessary support to the PH/M&E officer.
Grant implementation:
- Using external consultants for short-term assignments that require specific knowledge of the country disease program.
- Generalizing findings based on limited experience (e.g. one site) into an overall finding; findings need to be contextualized.
- Presenting a "laundry list" of recommendations instead of prioritizing them.

What are less effective LFA practices? (cont'd)
PU/DR:
- Lack of specificity/consistency, e.g. "target in the PF does not match the PU/DR target".
- Vague, non-committal language that does not inspire confidence in the findings, e.g. "the LFA considers the PR's explanations plausible".
- Lack of comprehensive investigation into identified issues and their underlying factors.
- Repeating what the PR has already noted (in the PU/DR), or repetition across several findings. Findings should be grouped and classified, as this facilitates analysis and the formulation of recommendations, and relevant findings should be cited.
- Numerators and denominators are not provided (or not verified) for percentage targets.
- Where the PF is quarterly and the PU/DR covers a semester, the LFA does not ensure that both quarters are reported in the PU/DR.
- For issues with indicator definitions or reporting, the LFA makes recommendations to the PR without first checking with the Secretariat.
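The last few points above are mechanical consistency checks: recomputing a percentage from its numerator and denominator, and confirming that both quarters of a semester are reported. A small sketch of both checks, with illustrative field names that are not the actual PU/DR template:

```python
# Hypothetical sketch of two routine PU/DR consistency checks;
# field names and tolerances are illustrative.

def check_percentage(numerator: int, denominator: int,
                     reported_pct: float, tol: float = 0.5) -> list[str]:
    """Recompute a percentage indicator and compare with the reported value."""
    if denominator == 0:
        return ["denominator missing or zero"]
    computed = 100.0 * numerator / denominator
    if abs(computed - reported_pct) > tol:
        return [f"reported {reported_pct}% but computed {computed:.1f}%"]
    return []

def check_quarters(reported_quarters: set[int],
                   expected: set[int] = {1, 2}) -> list[str]:
    """Where the PF is quarterly but the PU/DR covers a semester,
    both quarters must be present."""
    missing = expected - reported_quarters
    return [f"quarter {q} not reported" for q in sorted(missing)]

print(check_percentage(450, 600, 80.0))  # mismatch: computed 75.0%
print(check_quarters({1}))               # quarter 2 missing
```

Checks like these do not replace investigation of underlying causes, but they catch exactly the specificity and completeness gaps listed above.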

What are less effective LFA practices? (cont'd)
OSDV/RSQA:
- Vague, non-actionable recommendations (e.g. "supervision should be strengthened").
- Deviating from what was agreed in the OSDV proposal without informing the Secretariat.
- Unrealistic timelines.

Plenary discussion: information gaps and communication practices
- What other important communication gaps are there?
- What are good practices in communication between the LFA and the Secretariat, and between the LFA and the PR?
- Which communication practices are less effective?

LFA communication
Communication skills are critical to building and managing relationships with stakeholders:
- Build constructive, professional relationships based on mutual respect and transparency.
- Manage the expectations of other parties.
- Maintain sufficient professional distance to make judgments in the best interest of the Global Fund.
- Clarify your role and responsibilities.

LFA PET
The LFA Performance Evaluation Tool (PET) is a management tool for providing regular, structured and meaningful feedback to LFAs on their performance.
Objective: LFAs deliver consistently high-quality, risk-tailored, timely, relevant and best-value services in line with the Secretariat's expectations and requirements.

LFA PET – key changes
Current PET:
- PETs are completed for certain key LFA services.
- Only the six main LFA services are evaluated (PU/DR, PR assessment, grant renewal, OSDV, audit-related reports, and grant closure).
Revised PET:
- Mandatory: two PETs per year (one per year for countries with an LFA budget of USD 350,000 or less), covering all services provided by the LFA during that period.
- Voluntary: a separate PET for a specific service, or more frequent overall feedback.
- Feedback covers overall performance and all services provided by the LFA during a specific period.

LFA PET – key changes (cont'd)
No major changes in the areas of performance to be assessed:
- Completeness/accuracy/clarity
- Analysis and consistency
- Practicality of recommendations
- Timeliness/responsiveness/proactivity/communication
- Other
Revised PET: the average rating is calculated automatically; the FPM can adjust the automatically calculated rating if necessary.
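The automatic averaging with an optional FPM adjustment can be sketched as below. The area names, the numeric scale, and the override mechanism are assumptions for illustration; the slide does not specify how the PET form implements them.

```python
# Minimal sketch of the PET averaging described above; scale and
# override behavior are hypothetical.

AREAS = [
    "completeness/accuracy/clarity",
    "analysis and consistency",
    "practicality of recommendations",
    "timeliness/responsiveness/proactivity/communication",
    "other",
]

def pet_rating(scores: dict[str, float], fpm_override: float = None) -> float:
    """Average the per-area scores; the FPM may replace the automatic value."""
    auto = round(sum(scores[a] for a in AREAS) / len(AREAS), 2)
    return fpm_override if fpm_override is not None else auto

scores = dict.fromkeys(AREAS, 3.0)
scores["analysis and consistency"] = 4.0
print(pet_rating(scores))        # automatic average: 3.2
print(pet_rating(scores, 3.5))   # FPM-adjusted rating: 3.5
```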

LFA PET – key changes (cont'd)
Current PET: the LFA's response/comments are not included in the form.
Revised PET:
- The LFA's response, comments and proposed action plan will be included in the form.
- If no LFA comments are received within 15 days of submission to the LFA, the PET is closed automatically.
- A final approval step for the FPM will be added to reflect on the LFA's comments/response.
- The FPM/CT will be able to manually change the rating and add comments if considered necessary.
- Key submission dates will be saved automatically in the form.

Case study exercise
On your table you will find four short scenarios. Review and discuss the scenario that is highlighted; each table will discuss only one of the four. After reviewing the case study, discuss this question: "What should the LFA do, and what should the LFA not do, in this scenario?"

Final comments or questions? Thank you!