X4L Prog Mtg – March 05: Evaluation – What works best, when and why? Prof Mark Stiles, Project Director: SURF X4L, SURF WBL, SUNIWE and ICIER Projects; Head of Learning Development and Innovation, Staffordshire University


X4L Prog Mtg – March 05
Evaluation – What works best, when and why?
Prof Mark Stiles
Project Director: SURF X4L, SURF WBL, SUNIWE and ICIER Projects
Head of Learning Development and Innovation, Staffordshire University

Two thoughts
- Thought One: most things in education are changeable, context-specific and difficult to define
- Thought Two: business processes are always less well defined than you think

A Project Approach
- Evaluation must be embedded in the project
- You need to understand your plan and goals, and be ready for change!
- You need suitable tools to manage and evaluate

Columns: Narrative Summary (Description) | Verifiable Indicators | Means of Verification | Assumptions

Goal
- Narrative summary: statement of intention; the expected impact on the service or institution
- Verifiable indicators: information needed to determine progress; if possible, unit of measurement, quantity, quality and timing given
- Means of verification: source of indicator; how information will be collected, by whom and how frequently

Purpose
- Narrative summary: what the project expects to achieve in terms of development outcome
- Indicators, MOVs
- Assumptions: what needs to be true; Risks: what must not be true

Component Objectives
- Narrative summary: one per project output/activity area
- Indicators, MOVs, Assumptions/Risks

Outcomes/Outputs
- Narrative summary: for each Component Objective, specific results and tangible products
- Indicators, MOVs, Assumptions/Risks

Activities (optional)
- Narrative summary: outline of specific tasks to achieve required outputs
- Milestones; means of verification: management reports on physical/financial progress
- Assumptions/Risks
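This matrix follows the standard logical-framework layout. Purely as an illustration (the project does not prescribe any tooling, and every class and field name below is invented), each row can be held as structured data so that indicators, means of verification and assumptions stay attached to the level they evaluate:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FrameworkRow:
    """One level of the framework matrix: Goal, Purpose, Component Objective, Output or Activity."""
    level: str
    narrative_summary: str                                           # statement of intention / expected result
    indicators: List[str] = field(default_factory=list)              # verifiable indicators: quantity, quality, timing
    means_of_verification: List[str] = field(default_factory=list)   # how, by whom and how often data is collected
    assumptions: List[str] = field(default_factory=list)             # what needs to be true
    risks: List[str] = field(default_factory=list)                   # what must not be true

# Hypothetical example row, not taken from the SURF X4L plan
goal = FrameworkRow(
    level="Goal",
    narrative_summary="Expected impact on the service or institution",
    indicators=["Repurposed content in use in three modules by month 12"],
    means_of_verification=["Module review reports, collected termly by the evaluator"],
)
print(goal.level, "-", goal.indicators[0])
```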

Evaluation of Outputs and Outcomes
- Needs to be user-centred
- Needs to be formative and summative
- Needs to involve stakeholders
- Formative evaluation should:
  - Inform development
  - Inform the wider community

Evaluation – the basics
- What do you want to know about?
- Who are the stakeholders?
- What information can they provide?
- How will the information be obtained?
- How will you find out about things you are not expecting?

Indicators
- Related to what is to be changed – has change occurred?
- Usually quantifiable, but could also be qualitative
- Can the indicator be produced using available resources?
- Is the indicator useful/meaningful to the people who will use it?
- Will you find out in time to allow CHANGE to the conduct of the project?
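Purely as a hypothetical illustration of the "usually quantifiable" and "in time to allow change" points above (none of the names or figures below come from the project), a single rising numeric indicator could be checked like this:

```python
from datetime import date

def indicator_status(baseline: float, current: float, target: float,
                     review_date: date, project_end: date) -> dict:
    """Report progress on a rising numeric indicator and whether the review
    happens early enough to change the conduct of the project."""
    span = target - baseline
    progress = (current - baseline) / span if span else 1.0
    return {
        "progress": round(progress, 2),               # 1.0 means the target has been reached
        "target_met": current >= target,
        "in_time_to_change": review_date < project_end,
    }

# e.g. 40% of staff using the repository, against a 10% baseline and a 60% target
print(indicator_status(0.10, 0.40, 0.60, date(2005, 3, 1), date(2005, 9, 30)))
```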

Means of Verification
Useful questions to ask:
- How will the information be obtained?
- From where?
- By whom?
- When?
- In what form?

A User-Centred Approach
- Think of some broad questions
- Who are the major and minor stakeholders?
- Who can actually provide the information – the data source?
- Specific questions for each data source
- Instruments
- How will results be fed back and disseminated?
- See LTSN ELT015

Evaluation Template
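The template itself is not reproduced in the transcript. A hypothetical structure consistent with the preceding "A User-Centred Approach" slide is one record per broad question, linking stakeholders, data sources, specific questions, instruments and a feedback route; every value below is an invented example, not SURF X4L data:

```python
# One entry per broad evaluation question; all values are illustrative.
evaluation_plan = [
    {
        "broad_question": "Is repurposed content actually being used in teaching?",
        "stakeholders": ["teaching staff (major)", "students (minor)"],
        "data_sources": ["module tutors", "VLE usage statistics"],
        "specific_questions": [
            "Which repurposed objects did you use this term?",
            "What stopped you using the others?",
        ],
        "instruments": ["interview", "usage logs"],
        "feedback_route": "termly report to the project team and partner colleges",
    },
]

for row in evaluation_plan:
    print(row["broad_question"], "->", ", ".join(row["instruments"]))
```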

Instruments
- Interviews
- Focus Groups
- Observations
- Surveys/Questionnaires
- Statistics
- Logs/Histories
See:

More About Instruments: Interviews
- Depth of information
- Unexpected findings
- Fairly quick, but intensive
- Need the right people
- People lie! (and what they say is contextual)
Thanks to Sarah Agarwal of ILRT, Bristol

More About Instruments: Focus Groups
- Wide-ranging
- Unexpected findings
- Intensive preparation, but quicker analysis
- Quick
- Individuals can be a problem
- Findings are contextual
- Getting the group together can be a problem
- Need the right people
- Need a clear scope
- Takes skill

More About Instruments: Surveys
- Useful for finding out about priorities and reactions to a service
- Can be quick to analyse
- Considerable preparation
- Getting returns from a representative sample is hard
- Lack depth
- Takes skill
- Can be used as a con
- Usually poorly done

More About Instruments: Statistics/Logs/Histories
- Record everything!
- Develop an ongoing story/narrative
- Reflect on the story
- Draw conclusions
- Change the plan!
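One minimal way to keep that ongoing story is an append-only dated log whose reflective entries are reviewed at intervals; this is only a sketch, and all field names and example text are invented:

```python
from datetime import date

project_log = []  # append-only record of events, their sources and any reflection

def record(event: str, source: str, reflection: str = "") -> None:
    project_log.append({"date": date.today(), "source": source,
                        "event": event, "reflection": reflection})

record("Staff workshop on repurposing held; 8 of 12 tutors attended",
       source="observation",
       reflection="Attendance lower than expected: revisit timing in the plan")

# Periodic reflection: pull out the entries that suggest the plan needs changing
for entry in project_log:
    if entry["reflection"]:
        print(entry["date"], entry["reflection"])
```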

More About Instruments: Observations
- Record everything!
- Contribute to the ongoing story/narrative
- Can provide foci for interviews
- Be clear about what you are observing

What SURF X4L Did (technical thread not included here)
Two separate college partners:
- Agreed approaches
- Agreed areas of evaluation
- Needed to be able to compare and contrast

What SURF X4L wanted to know
Capture the whole story:
- Expectations
- Awareness
- Motivation and Engagement
- Use and Pedagogy
- Tools and Resources
- Roles
- Responsibilities
- Policies
- Procedures
- Cultural Change
- Outputs and Outcomes
- Evaluation
- Accessibility
- Conclusions and Recommendations

The instruments SURF X4L used
- Record everything – an ongoing story
- Regular reflection
- Common questionnaires for staff and students
- Observations
- Interviews
- Accessibility review by RNCB of technical and content outputs

The output approach
Each college has written up its own story:
- Local context
- Reflection against each area
- Main good practice and recommendations for each area
Main good practice and recommendations from each college are being combined and contrasted to produce (with the RNCB report):
- Guide to reuse and repurposing
- Pedagogic, cultural and organisational guide

Thanks