Monitoring and Evaluating in the PHEA Educational Technology Initiative
Patrick Spaven

The Project Cycle [diagram: Plan projects – and plan to monitor and evaluate them; Carry out the projects; Monitor; Formative evaluation; Summative evaluation; with inputs from the ET Strategy and Outside Stakeholders]

Levels
1. Your projects
2. Your programme
3. The whole Initiative

Monitoring and evaluating your projects

Monitoring
The capture of data about the project regularly or continuously, usually in a consistent way. Monitoring data can be expressed in numbers or concise narrative.

Evaluation
The all-round assessment of the performance of a project or programme. It uses data from monitoring plus data captured during the evaluation itself, e.g. qualitative interviews with key stakeholders.

Uses of monitoring
Provides up-to-date feedback on the performance of the project. Are we on track with:
- Inputs (use of resources)?
- Activities?
- Outputs?
- Short-term outcomes?
Contributes data to evaluations.

Uses of evaluation
- Improvement – Are we doing things right?
- Wider learning – What difference did we make? Did we do the right things?
- Accountability – Did we do what stakeholders expected?
- Advocacy – Look what we can do!

M&E
In human development, where results are often unpredictable, monitoring and evaluation are tending to converge:
- Monitoring should look beyond planned results.
- Evaluation should be a regular, timely process.

Planning your M&E – the important questions Why? For whom? What? How? When? Where?

Planning your M&E – why, for whom and what Be clear why you are doing it – what use you and others will make of it. Who are the main stakeholders? What do they need to know? In this light, and bearing in mind the resources available, decide what you should M&E and on what scale. Don’t plan to M&E anything that isn’t important to know about!

WHAT to M&E
- Outcomes
- Outputs
- Activities
- Inputs

WHAT to M&E
Outcomes = the changes the intervention helped to bring about (better educated students, more empowered academic staff) – both planned and unplanned
Outputs = the immediate, planned results of the intervention (technicians and academic staff trained, courses re-engineered and launched on the LMS)

WHAT to M&E
Activities = the things you did to ensure you delivered the outputs (identify and contract trainers; research the software options and procure)
Inputs = the resources you used in your activities

Kirkpatrick’s 4 levels for evaluating training
1. Reactive: how they felt about the training
2. Cognitive/affective: new knowledge, skills, ideas, attitudes
3. Behavioural: doing new things, doing things differently
4. Organisational/multiplier: effect on the organisation; diffusing benefits to others

Planning M&E – indicators
Decide whether it would be helpful to develop indicators for these elements (inputs, activities, outputs, outcomes). [The ETS LogFrame requires indicators for outputs.]
Indicators are pre-defined, precise pointers that help you assess performance against the background of the planned inputs, activities, outputs and outcomes.

Planning M&E – indicators
Examples:
- Proportion of activities in the project plan completed on time
- Proportion of trainees who report that the training either met or exceeded their expectations
- Numbers of students using the new LMS applications
- Date when ETS was formally adopted by the University XYZ Board
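Indicators like these can be computed directly from routine monitoring records. Here is a minimal Python sketch, assuming hypothetical record layouts – the field names, dates and ratings are illustrative, not part of any ETI template:

```python
from datetime import date

# Hypothetical monitoring records; field names and values are illustrative only.
activities = [
    {"name": "Contract trainers", "due": date(2010, 3, 1), "done": date(2010, 2, 20)},
    {"name": "Procure LMS",       "due": date(2010, 5, 1), "done": date(2010, 6, 10)},
]
trainee_ratings = ["met", "exceeded", "below", "met"]  # expectation ratings

# Indicator: proportion of planned activities completed on time.
on_time = sum(1 for a in activities if a["done"] and a["done"] <= a["due"])
print(f"Activities completed on time: {on_time / len(activities):.0%}")

# Indicator: proportion of trainees whose expectations were met or exceeded.
satisfied = sum(1 for r in trainee_ratings if r in ("met", "exceeded"))
print(f"Training met/exceeded expectations: {satisfied / len(trainee_ratings):.0%}")
```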

Planning M&E – indicators/targets
Indicators can be neutral, as in the previous slide. Or they can be expressed as targets, if you are confident the targets are appropriate:
- Students using the new LMS applications within 3 months of their launch
- ETS formally adopted by the XYZ Board by October 2010

Planning M&E - indicators Be careful not to let indicators narrow your perspective. Outcomes in particular are often difficult to predict in advance. Look for unplanned effects of your interventions.

A few words about baselines and attribution
You have a qualitative picture of where your institution stands in ET4TL on the verge of ETI Part B. But if you want to assess with precision the change that your intervention has promoted in a specific area – e.g. a change in attitudes or usage of ICT among a particular group – you may need to capture data on the baseline. This research must obviously be built into your project plan and implemented before the project gets moving.

A few more words about baselines and attribution
Trying to measure some sorts of change and attribute them to your intervention can be difficult. The before and after groups may not be comparable – e.g. different student cohorts – and change may be influenced by extraneous factors.
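To make the baseline point concrete, a before/after comparison might look like the sketch below. The scores are invented, and, as noted above, an observed change cannot simply be attributed to the intervention:

```python
from statistics import mean

# Hypothetical survey scores (e.g. self-rated ICT use on a 1-5 scale).
baseline = [2.1, 2.4, 1.8, 2.9, 2.2]  # captured before the project starts
endline  = [3.4, 3.1, 2.8, 3.9, 3.3]  # captured at summative evaluation

change = mean(endline) - mean(baseline)
print(f"Mean score: {mean(baseline):.2f} -> {mean(endline):.2f} (change {change:+.2f})")
# Caveat: the two groups may be different cohorts, and extraneous factors
# may account for part of the change, so interpret with care.
```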

Planning M&E – data
Work out HOW you are going to capture the data on inputs, activities, outputs and outcomes – including baseline data.
Bear in mind the cost, time and access issues in capturing the data. Don’t try to do too much. Use samples if appropriate (see the sketch below).
Work out how you are going to process and analyse the data.
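If capturing data from everyone would cost too much, a simple random sample keeps the workload manageable. A minimal sketch, assuming a hypothetical list of student IDs as the sampling frame:

```python
import random

# Hypothetical sampling frame: IDs of students enrolled on the new LMS courses.
students = [f"S{i:04d}" for i in range(1, 801)]

random.seed(42)                       # fixed seed makes the sample reproducible
sample = random.sample(students, 80)  # a 10% simple random sample for follow-up
print(sample[:5])
```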

Project Logical Framework – optional except for outputs
Columns: Descriptors | Performance indicators | Sources of verification | Risks/assumptions
Rows:
- Long-term Outcomes: should relate to the ETS LogFrame
- Outputs: key output, key indicators, key sources and key risks/assumptions all relate to the ETS LogFrame
- Activities
- Inputs

Questions to answer in a summative evaluation
- Effectiveness (have we fulfilled the project objectives?)
- Efficiency (have the resources – including time – been used optimally?)

Questions to answer in a summative evaluation
- Impact (what difference have we made – intentionally or unintentionally?)
- Sustainability (are the positive changes likely to last?)
- Relevance of project (were we doing the right things?)

The Project Cycle [diagram repeated: Plan projects – and plan to monitor and evaluate them; Carry out the projects; Monitor; Formative evaluation; Summative evaluation; with inputs from the ET Strategy and Outside Stakeholders]

Reporting on your projects
The MOA requires six-monthly reporting on projects:
- Progress towards results and indicators
- A summary of problems and challenges experienced
- Revised activity schedules for each project
- A budget variance report (a minimal sketch follows below).
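The variance on each budget line is simply the budgeted amount minus actual spend. A minimal sketch with invented line items and figures:

```python
# Hypothetical budget lines: (item, budgeted, actual spend to date).
budget_lines = [
    ("Training workshops", 20_000, 23_500),
    ("LMS licences",       15_000, 12_000),
    ("Consultancy",        10_000, 10_000),
]

print(f"{'Item':<20}{'Budget':>10}{'Actual':>10}{'Variance':>10}")
for item, budgeted, actual in budget_lines:
    variance = budgeted - actual  # positive = underspend, negative = overspend
    print(f"{item:<20}{budgeted:>10,}{actual:>10,}{variance:>+10,}")
```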

Reviewing projects
You will want to meet more frequently than that to review your projects, and to keep a note of what you conclude to feed into the six-monthly MOA reporting. Your regular review of projects will inform a six-monthly programme-wide self-assessment.

Reviewing your projects
We suggest you ask these standard questions, among others, at your project review:
- Have the activities and products been completed according to plan, in terms of both timing and quality? If not, why not? What have you done to address completion and quality challenges?
- What benefits is the project producing – for people and the institution as a whole?
- Are there any negative effects? If so, what are they? What is being done to mitigate them?

Monitoring and evaluating your programme

The self-assessment process
- Meeting of the ETI core team plus project leaders every 6 months, lasting 4-6 hours
- Report in full, with a 1-2 page summary to meet the MOA requirement (the MOA also requires six-monthly reporting on projects, as we have seen).

Self-assessment at programme level
How is the overall ETI progressing?
- What are the main changes/outcomes that have taken place for people and the institution as a whole – both positive and negative – as a result of the ETI?
- Are there any outcomes/changes that you were expecting by now, but which have not taken place? If so, why do you think they haven’t taken place?
- How useful has your ET Strategy been in this period? What specifically has it helped with?

Self-assessment at programme level
How is the overall ETI progressing?
- What aspects of your team’s work have been most constructive and productive?
- Are there aspects of your team’s work that have not worked well? If so, what are the probable reasons?
- In what ways has the wider institution supported progress in the ETI? Are there aspects of the wider institution that have hindered progress?

Self-assessment at programme level
How is the overall ETI progressing?
- What has been helpful to you in the work of the SAIDE-CET team and the external evaluator? What has not been helpful?
- What other support could they have given you that would have been helpful?

ETS Logical Framework
Columns: Descriptors | Performance indicators | Means of verification | Assumptions/risks
Rows:
- Vision
- Long-term Outcomes: project outcomes should relate to them
- ET Strategy Outputs: key outputs from project plans; defined in project plans