Programme Criticality Preparatory Webinar



Structure of this Presentation

- Programme Criticality: Overview
  - Key Principles
  - Why undertake a PC assessment? What for?
  - Using the results of a PC assessment in day-to-day decision-making: the "Acceptable Risk Model"
- Programme Criticality Assessment: Methodology
  - Key elements of the process
  - Overview: 8 steps of a PC assessment
  - Explanation of each step
  - Who should attend the PC workshop and why?
- Final Remarks
- Background and Learning Resources
- Q&A

1. Introduction to Programme Criticality

Key Principles

- Programme Criticality = the common UN system framework for decision-making on acceptable security risk.
- Mandatory in high-risk environments, applying to all UN personnel.
- Establishes guiding principles and a systematic, structured approach to ensure that programme activities can be balanced against security risks.
- Defines four levels of programme criticality (PC1-PC4).
- UN personnel includes international and national staff members, UN consultants, and individually deployed military and police personnel.

What for? Using PC results

A Programme Criticality assessment will help decision-making on:
- What can be delivered where with the presence of UN personnel
- Programme delivery strategies
- Where/what further security risk management measures may be needed
- Planning for business continuity
- Adjusting possible personnel deployments

UN Guidelines for Acceptable Risk

April 2009: the CEB approved the UN Security Management System Guidelines for Acceptable Risk.
- Do not accept unnecessary risk (there is no benefit in accepting a risk that does not advance the UN's objectives).
- Accept risk when benefits outweigh costs (all risks cannot be eliminated; that would be too rigid and costly, and avoiding all risks does not advance the UN's objectives).
- Make risk management decisions at the right level (take decisions on risk at the level of delegated authority; do not assume risk for which authority has not been received).

These are the risk management principles that apply to the "Acceptable Risk" model the CEB has approved. We will see later how these connect to Programme Criticality.

What for? Using PC results

PC is a decision-making tool for making legitimate and justifiable decisions on the acceptance of risk:

PC level + Statement of present risk (SRM) = Acceptable Risk

PC in day-to-day decision-making: The "Acceptable Risk Model"

Security Risk Assessment: present risk is rated by likelihood (Unlikely, Moderately likely, Likely, Very likely) and impact (Negligible, Minor, Moderate, Severe, Critical), yielding a present risk level of Low, Medium, High, Very High or Unacceptable.

Programme Criticality: outputs are rated against two criteria:
- Contribution to each in-country UN strategic result
- Likelihood of implementation

Balancing risks and programme criticality, each present risk level maps to the programme criticality required for implementation:
- Unacceptable risk: no activity (N/A)
- Very High risk: PC1 only
- High risk: PC2 and above
- Medium risk: PC3 and above
- Low risk: PC4 and above

PC1 = life-saving activities (at scale), or any activity endorsed by the SG (Principal's approval to implement in very high risk).
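The Acceptable Risk matrix on this slide can be sketched as a simple lookup. This is illustrative only: the level names and mapping follow this presentation, not an official implementation, and the function name is hypothetical.

```python
# Sketch of the slide's Acceptable Risk matrix (illustrative only).
# Lower PC number = more critical activity (PC1 is most critical).
MIN_PC_LEVEL = {
    "Low": 4,           # PC4 and above may be implemented
    "Medium": 3,        # PC3 and above
    "High": 2,          # PC2 and above
    "Very High": 1,     # PC1 only (with Principal's approval)
    "Unacceptable": 0,  # no activity (N/A)
}

def is_acceptable(pc_level: int, present_risk: str) -> bool:
    """Return True if an output at the given PC level (1-4) may be
    implemented under the stated present risk level."""
    threshold = MIN_PC_LEVEL[present_risk]
    return threshold > 0 and pc_level <= threshold

# A PC2 output is acceptable at High risk, but not at Very High risk.
assert is_acceptable(2, "High")
assert not is_acceptable(2, "Very High")
```

The design choice here mirrors the slide: the risk level sets a criticality threshold, and only outputs at or above that criticality clear it.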

2. Programme Criticality Assessment Methodology

Key elements of the process

- The assessment must be done by the whole UN country presence in a peer review format.
- The methodology must be applied as defined in the framework.
- Existing planning frameworks form the basis for the assessment. A PC assessment is not a programme planning process.
- Criticality means the impact of an activity on the population, not necessarily on an organisation.

8 steps of a PC assessment

Preparatory steps:
1. Establish geographical scope and timeframe
2. List strategic results
3. List UN outputs

PC peer review phase:
4. Assess contribution to strategic results (peer review)
5. Assess likelihood of implementation (peer review)
6. Evaluate activities/outputs with PC1 criteria
7. View PC level results, form consensus and approve final results
8. Agree on a process to manage and implement the results of the PC assessment

1. Geographical scope and timeframe

- Geographic scope should match the programmatic context. It can be the same as the SRM area, but does not have to be.
- The timeframe can be anything from 3 to 12 months. During the chosen timeframe, there should be no expected change in the programmatic context.
- Both can be aligned with planning and project cycles.
- A review/rollover every 6 months is recommended.

2. List strategic results (maximum 6)

- Should be based on existing planning frameworks (e.g. UNDAF, HRP, ISF, Joint Action Plan). PC is not a planning or prioritization exercise!
- Should reflect an accurate balance of the UN's collective priorities for the geographic area and specified timeframe.
- Should capture the totality of UN outputs in the area of coverage.
- Must be endorsed by leadership in advance of the rating exercise.
- Must be clear to all what they mean/include.

2. Examples of strategic results (example slide)

3. List UN outputs

- Following UN RBM practice, combine activities at output level in order to rate them against strategic results.
- Only list outputs that involve UN personnel.
- Ensure consistency in the formulation of outputs; this matters for how ratings are done.

3. Examples of outputs (example slide)

4. Rating the contribution to strategic results

Make use of the entire rating scale (0-5) to rate the contribution of an output against strategic results. As an example, the rating scale could be defined as follows:
- "5" = 'very high contribution to success', when an output fully and directly contributes to a strategic result
- "3 or 4" = when there is a secondary but notable contribution of the activity to the strategic result
- "1 or 2" = when the activity may make a small, indirect contribution to the strategic result
- "0" = 'no contribution', when there is no contribution or linkage to this strategic result

Discuss how to score 'enablers', e.g. coordination, management, operations, logistics support, etc.
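The rating step above amounts to filling a matrix of outputs against strategic results. A minimal sketch, with hypothetical output and result names (the PC tool itself is a spreadsheet-style instrument, so this is only a data-shape illustration):

```python
# Hypothetical 0-5 contribution ratings of UN outputs against strategic
# results (SR1, SR2). Names are invented for illustration.
contributions = {
    "Mobile health outreach": {"SR1 Basic services": 5, "SR2 Stabilization": 1},
    "Coordination support":   {"SR1 Basic services": 2, "SR2 Stabilization": 3},
}

def highest_contribution(output: str) -> int:
    """An output's strongest link (0-5) to any strategic result; one
    plausible summary figure for the peer review discussion."""
    return max(contributions[output].values())

assert highest_contribution("Mobile health outreach") == 5
```

Note how an 'enabler' such as coordination support picks up moderate ratings across several results rather than a single direct "5", which is exactly why the slide asks the group to agree in advance on how enablers are scored.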

4. Example: Contribution to strategic results (screenshot from PC tool)

5. Rating the likelihood of implementation

Collectively agree on criteria to measure the likelihood of implementation of activities (not including security). For example:
- acceptance (government, community)
- staff capacity
- implementing partner capacity
- funding
- logistics
- physical access (roads, air strips, seasonal climatic conditions, etc.)

Guiding question: 'How do you know you can do this?'

Use the entire rating scale (1-5), e.g.:
- "5" = very likely implementation, all criteria are met (practically guaranteed)
- "1" = very unlikely implementation, none of the criteria currently met (practically impossible)

5. Example: Rating the likelihood of implementation (screenshot from PC tool)

The PC tool then generates preliminary ratings (screenshot from PC tool).

6. Evaluating outputs with PC1 criteria

Two criteria for PC1:
- The activity is assessed as lifesaving at scale (humanitarian or non-humanitarian, defined to include any activity to support processes or services, such as needs assessments) that would have an immediate and significant impact on mortality; or
- Directed activity: the activity receives the endorsement of the Office of the Secretary-General for this particular situation.

Where an activity is PC1 and implemented at very high present risk:
- Certification by the Executive Head of the relevant UN entity is required.
- Final approval is given by the USG DSS.
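The slide's PC1 logic reduces to two boolean checks. A minimal sketch, with hypothetical function and parameter names:

```python
# Illustrative sketch of the slide's PC1 criteria; field names are hypothetical.
def is_pc1(lifesaving_at_scale: bool, osg_endorsed: bool) -> bool:
    """An output is PC1 if it is lifesaving at scale OR is a directed
    activity endorsed by the Office of the Secretary-General."""
    return lifesaving_at_scale or osg_endorsed

def may_implement_at_very_high_risk(pc1: bool,
                                    exec_head_certified: bool,
                                    usg_dss_approved: bool) -> bool:
    """PC1 at very high present risk additionally requires Executive Head
    certification and final approval by the USG DSS."""
    return pc1 and exec_head_certified and usg_dss_approved

assert is_pc1(True, False)
assert not may_implement_at_very_high_risk(True, True, False)
```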

Final steps

Step 7. View PC level results, form consensus within the UN system and approve final results
- Once agreed by the programme managers/peer reviewers, the final results must be validated by the UN team in country and approved by the RC or SRSG/Head of Mission, as applicable.
- If consensus is not reached at country level, the PC Steering Group at ASG level can mediate and/or ultimately decide.

Step 8. Agree on a process to manage and implement the results
- The final step is to implement the results of the PC assessment. This entails using the results together with the relevant SRA(s) to determine acceptable risk levels for UN outputs.
- Additional risk mitigation measures may be required for certain outputs.
- It is recommended to establish a PC Working Group that regularly reviews the PC assessment and advises when a full revision is necessary.

Who should attend the PC peer review exercise?

- Heads of Programme (Deputy Representatives) and/or Senior Managers.
- PC focal points must have a full overview and understanding of all of their entity's activities.
- UN entities should ensure that their focal points are available to participate throughout the entire PC assessment.
- All PC participants need a sufficient understanding of Programme Criticality, for example by having read the Programme Criticality Framework and completed the e-course on Programme Criticality.

Final Remarks

- Present security risk levels of Unacceptable or Very High are rare.
- Move from "acceptable threat" to "acceptable risk" decisions.
- PC inputs (and fits logically) into the Acceptable Risk Model.
- The SRM empowers programmers: efforts to lower the risk can be reflected in the SRM.
- The higher the risk, the higher the level of decision.
- PC assessments are done only for programmes that involve UN personnel.
- The PC assessment is done jointly by the UN system in country (peer review).

3. Background & Learning Resources

3. Background and Learning Resources

- Programme Criticality on the CEB website
- UN System's Programme Criticality Framework (approved 2013)
- E-Course on Programme Criticality (approx. 30 minutes)
- Additional guidance and resources will be provided directly to the PC focal point in country for circulation to all participants.

PC co-chairs: Rebecca Jovin (rebecca.jovin@undp.org), Simon Butt (butt2@un.org)
PC Secretariat: Bastian Richter (bastian.richter@undp.org)

4. Questions