Evaluation Emma King.


Evaluation is not at heart about collecting evidence to justify oneself, nor about measuring the relative worth of courses and teachers. It is about coming to understand teaching in order to improve student learning. (Ramsden, 1992, p. 241)

Learning Outcomes By the end of the session you should be able to: identify the purpose for evaluation; select appropriate evaluation strategies for your own context; recognise what makes an effective survey or focus group; discuss how you might make use of evaluation data.

Evaluation Process: value judgement, measurement, monitoring, action. Butcher, C., Davies, C. and Highton, M. (2006, p. 189)

Evaluation Process: Teach → Feedback & Reflect → Think → Plan (after Kolb, 1984)

Planning an Evaluation Identify the purpose of the evaluation Identify and give priority to the constraints under which the evaluation will take place Plan the possibilities for an evaluation within these constraints Agree an evaluation design

Considerations WHY do you want to evaluate? WHAT do you want to evaluate? WHEN is the best time to conduct the evaluation? WHO will you get evidence from, and who is it for? HOW will you do it and how will you act on the outcome?

Why? To improve practice To ensure student needs are being met To demonstrate the value of the service and secure future budget To determine if we are meeting agreed quality standards and targets

What? Objectives model: are objectives being met? Experimental: is the innovation working? Decision-making: which of the two strategies is most suitable? Illuminative: what's going on? Morrison (1993)

When? Diagnostic: check the needs of a particular student cohort. Formative: find out how the current cohort are getting on. Summative: inform the next iteration of support.

Who? Yourself; students (current students and alumni); colleagues (within the department and beyond); your manager.

Kirkpatrick's Four Levels of Evaluation. Level 1, Reaction: how do the target audience respond? Level 2, Learning: what attitudes, knowledge and skills are acquired? Level 3, Behaviour: to what extent do participants apply what they have learnt? Level 4, Results: what is the impact of the training on the organisation or environment?
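Kirkpatrick's levels can double as a checklist when planning what evidence to collect. A minimal sketch in Python; the evidence sources listed here are illustrative assumptions, not methods prescribed by the model:

```python
# Map each Kirkpatrick level to its name and a possible evidence source.
# The evidence sources are hypothetical examples, not fixed requirements.
KIRKPATRICK = {
    1: ("Reaction", "end-of-session questionnaire"),
    2: ("Learning", "pre/post self-assessment of skills"),
    3: ("Behaviour", "follow-up observation or interview"),
    4: ("Results", "departmental or institutional outcome data"),
}

def evidence_plan():
    """Return one planning line per level, lowest level first."""
    return [f"Level {n} ({name}): {source}"
            for n, (name, source) in sorted(KIRKPATRICK.items())]

for line in evidence_plan():
    print(line)
```

Working up the levels in order is a useful discipline: evidence for Level 4 is only persuasive if you can also show the lower levels were satisfied.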

Methods: focus groups, questionnaires, semi-structured interviews, observation. Considerations: practicality, objectivity, reliability, validity.

Survey Questions. How old are you? (15-20, 20-30, 30-40, 40-50, 50-60.) How effective was that session? Was the support offered timely, appropriate and making use of appropriate resources? Would your student group respond well to... ?

Writing good survey questions Only ask questions directly related to what you’re evaluating Keep the language clear and unambiguous Vary the format of questions to encourage students to think about their responses Order your questions logically Avoid yes/no questions Remember you can use an even number of options to encourage students to make a decision
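The even-number-of-options tip can be made concrete with a quick tally. A minimal Python sketch, where the four-point scale and the responses are made-up examples: with no neutral midpoint, every respondent has to lean one way or the other.

```python
from collections import Counter

# Hypothetical responses to a four-point item (no neutral midpoint),
# so each student must lean either positive or negative.
SCALE = ["strongly disagree", "disagree", "agree", "strongly agree"]
responses = ["agree", "agree", "disagree", "strongly agree", "agree"]

counts = Counter(responses)
for option in SCALE:
    print(f"{option}: {counts[option]}")

# Share of respondents choosing one of the two positive options.
positive = sum(counts[o] for o in SCALE[2:]) / len(responses)
print(f"positive: {positive:.0%}")
```

With a five-point scale the midpoint often absorbs undecided responses; the forced-choice version trades that comfort for a clearer overall signal.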

Interviews and Focus Groups Consider how you will record the data Design questions to elicit reflection and allow participants to express their own point of view Ask questions which will encourage group interaction and discussion Consider how you will manage the group discussion Have a strategy ready for if the conversation goes off at a tangent

Interpreting Feedback What does it tell us? What weight should we give to different questions? How do we respond? Whose responsibility is it to respond?
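The question of weighting can be made concrete with a small calculation. A sketch in Python, where both the per-question mean scores and the weights are hypothetical; the weights express how central each question is to what was actually being evaluated:

```python
# Hypothetical mean scores (1-5 scale) per survey question, and weights
# reflecting how central each question is to the evaluation's purpose.
scores = {"materials": 4.2, "pace": 3.1, "learning gain": 3.8}
weights = {"materials": 1.0, "pace": 0.5, "learning gain": 2.0}

# Weighted mean: questions about learning gain count most here.
weighted = sum(scores[q] * weights[q] for q in scores) / sum(weights.values())
print(f"weighted overall score: {weighted:.2f}")
```

The point is not the arithmetic but the judgement it forces: deciding the weights makes explicit which feedback you treat as central and which as peripheral.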

Closing the Loop After an evaluation, make sure that you feed back to everyone involved about how you've made use of the data and what's happening as a result.

Pathway approach - IMPACT levels (levels 0-4). [Diagram: a cause-and-effect pathway for researcher training and development. Inputs at Level 0 (Foundations) lead to outputs at Level 1 (Reaction), Level 2 (Learning) and Level 3 (Behaviour), which lead to benefits, the outcomes/results at Level 4.]

IMPACT Level 4 - Potential outcomes - Final results of the training and development activity. [Chart: potential outcomes plotted by complexity (number of factors) against time after the training and development activity; examples range from researcher engagement in training and better quality research, through qualification rates, a skilled workforce, and spin-outs and start-ups by trained researchers, to social and cultural capital, personal transformation and economic growth.]

IMPACT Level 4 - Potential outcomes - Final results of the career development activity. [Chart: potential outcomes plotted by complexity (number of factors) against time after the career development activity; examples range from student engagement in the career development activity, through the student taking action, career management skills, career adaptability, and the graduate getting a job, to social and cultural capital and economic growth.]