Developing Monitoring & Evaluation Frameworks: Process or Product? Anne Markiewicz.

Monitoring and Evaluation Frameworks
The development of Monitoring and Evaluation Frameworks that integrate the two functions is an increasingly prevalent requirement of program design and development.

Monitoring and Evaluation Frameworks: Questions
Is the development of M&E Frameworks about the process as well as the product? What degree of consultation and participation is appropriate? Are realistic time commitments set aside for the tasks?

The Monitoring & Evaluation Framework as a Product
- Introduction: approach to M&E
- Program Profile: the program and its relationship to M&E
- Program Logic: inputs, activities, outputs, outcomes, impacts
- Monitoring Plan
- Evaluation Plan
- Data Collection and Analysis Strategy
- Reporting Strategy
- Implementation Strategy
- Strategy for Learning and Reflection
- Recommendations
- Data Collection Formats

M&E Framework as a Process

Who is Involved in the M&E Process?
Three broad groups of stakeholders:
- Funders, policy makers and senior management
- Staff, practitioners or community members who implement the program, and
- Service users, beneficiaries or clients and their representatives

Ladder of Participation

Participatory M&E

Developing the M&E Framework in Collaboration: 5 Possible Steps
1. Developing a Program Logic: what the program is intended to achieve
2. Developing Evaluation Questions: what is to be known about how the program operates
3. Producing a Monitoring Plan: how to answer evaluation questions through monitoring processes
4. Producing an Evaluation Plan: how to answer evaluation questions through formative and summative evaluation activities
5. Developing an Evaluation Rubric: how to identify success, or lack of success, when implementing the evaluation

Step 1: Developing Program Logic
What is the rationale and intent of our program: what has it been set up to achieve? What is its theory of change? How can we display this intent using a Program Logic approach?
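The presentation does not include a worked example, so the following is a minimal sketch of how the five program logic levels named above can be laid out. The program (a youth mentoring service) and every item in it are illustrative assumptions, not content from the slides.

```python
# Illustrative only: a minimal program logic for a hypothetical youth
# mentoring program, expressed as the five-level chain named on the slide.
program_logic = {
    "inputs":     ["funding", "trained mentors", "program coordinator"],
    "activities": ["recruit and match mentors", "run weekly mentoring sessions"],
    "outputs":    ["number of mentoring sessions delivered",
                   "number of young people matched"],
    "outcomes":   ["improved school engagement", "stronger support networks"],
    "impacts":    ["better long-term education and employment prospects"],
}

# Reading the chain top to bottom states the program's theory of change:
# inputs fund activities, activities produce outputs, outputs are expected
# to generate outcomes, and outcomes contribute to the intended impacts.
for level, items in program_logic.items():
    print(f"{level}: {'; '.join(items)}")
```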

Step 2: Developing Evaluation Questions
What are the key questions that we want answered in relation to our program? What do we want to know for learning? What do we need to know for accountability to funders and others?

Step 3: Identifying Available Monitoring Data, Data Gaps and Data Needed
What monitoring data do we routinely collect that can be used to answer some of these questions? What additional monitoring data do we need to collect to answer the questions? What monitoring data do we need to collect for accountability purposes rather than for answering the evaluation questions?
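One way to make this step concrete is to audit each evaluation question against existing data and gaps. The sketch below shows such an audit in Python; the questions, data sources, and gaps are hypothetical assumptions, not examples from the presentation.

```python
# Illustrative only: for each evaluation question, record which monitoring
# data already exist and which data are missing and still need collecting.
monitoring_audit = [
    {
        "question": "Is the program reaching its intended participants?",
        "existing_data": ["enrolment records", "demographic intake forms"],
        "data_gaps": ["referral source tracking"],
    },
    {
        "question": "Are participants satisfied with the service?",
        "existing_data": [],
        "data_gaps": ["routine participant feedback survey"],
    },
]

# Flag every question that current monitoring cannot yet answer.
for row in monitoring_audit:
    if row["data_gaps"]:
        print(f"Gap for: {row['question']} -> {', '.join(row['data_gaps'])}")
```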

Step 4: Including Evaluation
What data should we aim to collect through planning for and undertaking periodic formative and summative evaluations? When should such evaluations take place during the life of the program? What funding do we have to undertake formal evaluations, and should they be internal or external?

Step 5: Evaluation Rubrics
Establishing performance 'standards' (definitions of what constitutes 'excellent', 'adequate' and 'poor' performance against identified criteria), then applying these standards to the data to draw conclusions about performance quality, success and value, as well as the underpinning mechanisms and context for the outcomes (Davidson 2005).

An Evaluation Rubric
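The original slide presumably showed a rubric as an image. As a stand-in, here is a minimal sketch of the mechanism Step 5 describes: standards defined per criterion, then applied to observed data. The criteria, thresholds, and observed values are hypothetical assumptions.

```python
# Illustrative only: a tiny rubric mapping an observed indicator value to
# the 'excellent' / 'adequate' / 'poor' standards described in Step 5.
RUBRIC = {
    # criterion: (minimum for 'excellent', minimum for 'adequate')
    "participant retention rate": (0.85, 0.60),
    "sessions delivered vs planned": (0.95, 0.75),
}

def rate(criterion: str, value: float) -> str:
    """Apply the rubric's standards to a single observed value."""
    excellent, adequate = RUBRIC[criterion]
    if value >= excellent:
        return "excellent"
    if value >= adequate:
        return "adequate"
    return "poor"

# Hypothetical monitoring results for one reporting period.
observed = {"participant retention rate": 0.72,
            "sessions delivered vs planned": 0.97}

for criterion, value in observed.items():
    print(f"{criterion}: {value:.0%} -> {rate(criterion, value)}")
```

A real rubric would usually also record the evidence and context behind each judgement, not just the threshold comparison, so that conclusions about mechanisms and outcomes can be defended.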

Facilitation Skills
- Provide structure and manage time
- Move toward a clear and tangible outcome from the workshops and meetings
- Elicit input from all parties wherever possible
- Keep focus and direction
- Set up a process for additional feedback and input beyond meetings, to keep the process going
- Circulate drafts for comment
- Ensure the process is understood as a collaborative venture

Conclusion
The development of a Monitoring and Evaluation Framework is more than a written product; it is the culmination of a collaborative process. There should be sufficient opportunity to engage the key program stakeholders who will implement the M&E Framework. This approach increases the likelihood that the M&E Framework will be implemented and used to guide reflection and learning.