Supporting continuous improvement in the replication process. Getting to Grips with Replication, Seminar 3: Monitoring, evaluation & continuous improvement. 21st March 2013. Kerstin Junge, Tavistock Institute of Human Relations.

Introduction
– Why is it useful to think of replication in continuous improvement terms?
– How can we use monitoring and evaluation as tools for continuous improvement in the replication process?
– How do these tools affect the impact of the replication process?
– Conclusions

Why is it useful to think of replication in continuous improvement terms?

How can we use monitoring and evaluation as tools for continuous improvement in the replication process?

Monitoring and evaluation for continuous improvement in the replication process: the case of Realising Ambition

Replication Phase 1: Identifying innovations for replication and deciding to adopt. Requirement to engage with existing impact evidence as an integral part of programme design.

“The application stage forced me to ask the… team: why are we doing it in this way? How do we know it works? It was challenging to ask the organisation. (…) These are hard questions. Realising Ambition clarified the thinking in the organisation.”

“The standards of evidence exercises in the application forms (…) helped us to really get to know [the intervention]. They helped us test [it] in a systematic way to see if we wanted to use it. It helped me to look forward and see what the evidence said. (…) The evidence was stronger when teachers delivered the intervention rather than external people, so we are using teachers.”

“[The] application process challenged us, (…) it asked for some quite robust information and we had to organise ourselves to get that. [It] challenged us to smarten up the back end, e.g. pulling the logic model together; we knew it and played around with doing that and had not expressed that in a document. (…) We actually said even if we don’t get that, [the application process] actually helped us organise ourselves (…).”

Creating preconditions for impact through:
– Better and explicit knowledge of the intervention and effective delivery
– A clearer understanding of causal pathways
– Preparing the organisation for replication
Embeds evidence-based and reflective practice in the programme community.

Process evaluation: sense-making and capturing of lessons learnt to support future replication investments; analysing programme architecture, programme design and the application process.
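The quotes above turn on making the intervention's logic model explicit before replication begins. As a purely illustrative aside (not part of the original seminar), here is a minimal sketch of what writing down and sanity-checking a logic model might look like; the intervention, its stages and all entries below are invented examples, not drawn from Realising Ambition.

```python
# Hypothetical sketch: a logic model written down as data, plus a
# completeness check of the causal chain. All entries are invented.

logic_model = {
    "inputs": ["trained teachers", "session manuals", "funding"],
    "activities": ["weekly classroom sessions", "teacher training"],
    "outputs": ["12 sessions delivered per class", "20 teachers trained"],
    "outcomes": ["improved pupil engagement"],
    "impact": ["reduced risk behaviour"],
}

def check_logic_model(model):
    """Check that every stage of the causal chain is filled in, so the
    causal pathway is explicit before replication starts."""
    stages = ["inputs", "activities", "outputs", "outcomes", "impact"]
    for stage in stages:
        if not model.get(stage):
            print(f"Missing or empty stage: {stage}")
            return False
    print("Logic model is complete: " + " -> ".join(stages))
    return True

check_logic_model(logic_model)
```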

Monitoring and evaluation for continuous improvement in the replication process: the case of Realising Ambition

Replication Phase 2: first wave implementation. Monitoring system: keeping the programme on track, understanding challenges, improving delivery. Preparation for (new) impact evaluations.

“One of the most significant things was (…) looking at how we constructed the logic model. (…) Now it's much easier to be specific.”

Webinars “(…) spark thinking which can feed into plans and keep them on track.”

“Some of the work that is being [done] on theory of change; understanding replication and fidelity. What [intervention owners] do is set up the programme, train the staff and give us assistance. But they don’t engage staff in discussing and understanding programme fidelity and replication.”

Probability of achieving the desired impact increases through:
– Greater specificity of the intervention and expected outcomes
– An improved understanding of the importance of fidelity in delivery
Offers data, learning and knowledge to guide and improve replication delivery.

Process evaluation: sense-making and capturing of lessons learnt to support future replication investments; identification and (early) definition of replication models; overview of types and benefits of impact evaluation and other support; replication progress indicators.
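To make "keeping the programme on track" concrete, here is a hedged, hypothetical sketch of a monitoring report that flags indicators falling short of target; the indicators, targets, actuals and tolerance are all invented for illustration and do not describe the Realising Ambition monitoring system.

```python
# Hypothetical sketch: flagging replication progress indicators that
# fall short of target. Indicators and figures are invented examples.

indicators = [
    # (indicator, target, actual)
    ("sites delivering the intervention", 10, 9),
    ("sessions delivered with full fidelity (%)", 90, 72),
    ("staff trained before first delivery (%)", 100, 100),
]

def monitoring_report(indicators, tolerance=0.1):
    """Flag indicators more than `tolerance` below target, so delivery
    problems surface early enough to be corrected."""
    for name, target, actual in indicators:
        shortfall = (target - actual) / target
        status = "ON TRACK" if shortfall <= tolerance else "REVIEW"
        print(f"{status:8} {name}: {actual}/{target}")

monitoring_report(indicators)
```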

Effects on impact
Creating preconditions for impact:
– Better and explicit knowledge of the intervention and effective delivery
– Clearer understanding of causal pathways
– Preparing the organisation for replication
Greater probability of achieving the desired impact:
– Greater specificity of intervention and expected outcomes
– Improved understanding of the importance of fidelity in delivery
– Data, knowledge and learning to improve replication delivery

Conclusions
Understanding replication as a continuous improvement process offers a ‘practical’ framework to guide activities, as well as tools.
Monitoring and evaluation tools should be an integral part of the replication process from the start, both at project level and at programme / policy level:
– This creates a structure to support constructive engagement with evidence, evidence-based practice and learning from experience
– And through this it supports learning and continuous improvement
It means understanding monitoring and evaluation as being at least as much about learning as about ‘auditing’.
Possibility of not only ‘single-loop learning’ but also ‘double-loop learning’ (Argyris and Schön)?
– Not just continuous improvement of implementation (revisiting ‘action strategies’)
– But also revisiting the values [mental models, beliefs, intentions] underpinning replication actions and strategies (‘governing variables’)