Road Safety ETP Evaluation Training

Presentation transcript:

Road Safety ETP Evaluation Training

Aim of the day To develop participants’ ability to carry out ETP Evaluation

Intended Learning Outcomes
Be familiar with the meaning of specific terms used in evaluation
Be able to distinguish between goals, aims and objectives
Be able to set appropriate objectives
Be able to construct and use a logic model
Understand the pros and cons of different evaluation designs
Be able to write closed and open questions

Glossary Treasure Hunt
Remember that pre-seminar task? Check your answers with the definitions displayed around the room. Make sure you introduce yourself to anyone you don't already know!
Before the workshop, attendees were given the terms and definitions exercise to complete. On the day of the workshop all of the terms with their correct definitions are displayed on the walls around the room. Workshop attendees are asked to walk round looking at all of the terms, to check their answers.

The Evaluation Process
Planning and evaluating an intervention – key steps. What comes to mind?

7 Key Steps in Planning and Evaluating an Intervention
1) Needs Analysis – 1st data collection stage: What exactly is the problem and what is the best way to solve it?
2) Intervention Planning – What are we going to do and how? (aims and objectives)
3) Monitoring and Evaluation Framework – What, when and how to measure? 2nd data collection stage: baseline data

7 Key Steps in Planning and Evaluating an Intervention
4) Implement Intervention – 3rd data collection stage: collect monitoring data
5) Main Data Collection – collect 'post-intervention' data. Review the data collected – do you have what you need? Analyse the data
6) Report – write the report and disseminate findings
7) Use – apply the findings: amend the intervention as necessary, and use them in the needs analysis stage (Step 1) of future interventions

The Intervention Evaluation Cycle
Identify need, target group & likely 'best' intervention (local data, strategy, MAST, previous evaluations)
Plan project and evaluation: set aims & objectives, decide evaluation design and methods
Deliver intervention
Evaluate processes and/or outcomes
Publish and share results
Feed results and lessons learned into future projects

What is the difference...
Between an aim and an objective? Between a goal and an aim?
RSHC since summer '08. Mainly in Yorkshire and Humber and South East regions, but also South West, East of England.

Definitions: A Refresher
Goal (very broad): The overall reason behind the intervention – e.g. to reduce road casualties
Aim(s) (specific): What and who the intervention will change – e.g. to reduce the incidence of drink-driving amongst 17-25 year olds
Objective(s) (very specific): Specific outcome the intervention intends to achieve (SMART) – e.g. to see an increase of 20% in the knowledge of 17-25 year olds about the legal penalties for drink-driving, by September 2011; to reduce mean scores in a self-report survey of drink-driving behaviour completed by 17-25 year olds, by November 2011
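
To make the objective example concrete, here is a minimal sketch of how the '20% increase in knowledge' objective could be checked once pre- and post-intervention quiz scores have been collected. The scores, group size and pass threshold below are invented for illustration only and are not taken from any real project:

```python
# Minimal sketch: has the "20% increase in knowledge" objective been met?
# The pre/post quiz scores below are hypothetical, for illustration only.

pre_scores = [4, 5, 6, 5, 7, 4, 6]    # knowledge quiz scores before the intervention
post_scores = [6, 6, 7, 7, 8, 5, 7]   # scores from the same group afterwards

pre_mean = sum(pre_scores) / len(pre_scores)
post_mean = sum(post_scores) / len(post_scores)
percent_change = (post_mean - pre_mean) / pre_mean * 100

print(f"Mean knowledge score: {pre_mean:.1f} -> {post_mean:.1f} ({percent_change:+.0f}%)")
print("Objective met" if percent_change >= 20 else "Objective not met")
```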

Example: Older Drivers Project
Aims:
To increase self-awareness of older drivers
To improve driving skills of drivers over the age of 55

Older Drivers Project Objectives (by March 2012):
To increase older drivers' knowledge of the effects of medication on driving ability
To improve older drivers' journey planning
To increase self-identification of new aspects of their driving which could be improved
To increase older drivers' skills scores in a repeat test drive
Needs analysis seminars gave information on: What does evaluation mean to you? What routine data (internal) are available to you? What level of support/training does your organisation have to carry out evaluation? What are the barriers to doing evaluation? Are there clear evaluation training needs which DfT / RoSPA could assist with?

Logic Model Components
Inputs
Outputs
Outcomes
Assumptions
External factors
Aims and objectives

Example Logic Model
ASSUMPTIONS: Drivers accept the ADI assessments of their skills. Drivers identify themselves as an 'older' driver and see a need to attend the sessions/assessments. That the sessions and assessments will have a positive influence, i.e. not make drivers unduly cautious or over-confident.
EXTERNAL FACTORS: Adverse weather, local news stories involving older drivers, increased insurance premiums, family members, poor health.
Outcomes can be short term (immediate changes), intermediate, and long term.
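
One optional way to keep the logic model components together during planning is to record them as a simple structured outline. The sketch below restates the Older Drivers example in that form; the 'inputs' and 'outputs' entries are illustrative guesses, since the slides above only give aims, outcomes, assumptions and external factors:

```python
# A logic model recorded as plain structured data (illustrative sketch only).
older_drivers_logic_model = {
    "aims": [
        "Increase self-awareness of older drivers",
        "Improve driving skills of drivers over the age of 55",
    ],
    "inputs": ["ADI assessors", "session venues", "course materials"],   # assumed examples
    "outputs": ["sessions delivered", "assessed drives completed"],      # assumed examples
    "outcomes": {
        "short_term": "Increased knowledge of the effects of medication on driving",
        "intermediate": "Improved journey planning",
        "long_term": "Improved skills scores in a repeat test drive",
    },
    "assumptions": ["Drivers accept the ADI assessments of their skills"],
    "external_factors": ["Adverse weather", "increased insurance premiums", "poor health"],
}

for component, content in older_drivers_logic_model.items():
    print(component, "->", content)
```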

Evaluation Designs
Post only with no comparison group
Post only with comparison group
Post then pre, no comparison group
Pre and post with no comparison group (non-experiment)
Pre and post with comparison group (quasi-experiment)
Randomised controlled trial (RCT)
Participants are asked to discuss the different designs and name advantages and disadvantages of each.

Evaluation Designs – Points to consider:
How many groups of participants need to be surveyed, and how many times? (e.g. a post-only design with no comparison group surveys only one group, once)
What are the implications for your time and resources, participants' responses, and drop-out rates?
Has any baseline measurement been taken? You need a baseline to be able to measure change
Can you know whether it was the intervention that caused the change, or factors external to the intervention? e.g. age/maturation, other interventions in the area, or random chance (such as someone close to the participant being involved in a road traffic incident)
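
To make the baseline and comparison-group points concrete, the sketch below contrasts a naive pre/post change with a change adjusted using a comparison group (a simple difference-in-differences, which is one way, though not the only way, to allow for external factors). All figures are invented for illustration:

```python
# Illustrative only: why a baseline and a comparison group matter.
# Mean self-reported drink-driving scores (lower = better); all figures invented.

intervention_pre, intervention_post = 3.2, 2.4   # group that received the intervention
comparison_pre, comparison_post = 3.1, 2.9       # similar group that did not

naive_change = intervention_post - intervention_pre      # -0.8: looks like a large effect
background_change = comparison_post - comparison_pre     # -0.2: change with no intervention at all
adjusted_effect = naive_change - background_change       # -0.6: change beyond the background trend

print(f"Naive pre/post change:      {naive_change:+.1f}")
print(f"Change in comparison group: {background_change:+.1f}")
print(f"Difference-in-differences:  {adjusted_effect:+.1f}")
```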

Methods in our Madness!
You want to show effectiveness and have the opportunity to do a quasi-experiment (e.g. pre and post with comparison group)
Choose a method suitable for the intervention you have been working on, e.g. telephone interview or self-complete questionnaire
Write 5 questions. At least one question should be open-ended.

Question Writing Basics
Keep questions short
Do not ask two questions in one, e.g. 'How enjoyable and informative did you find this workshop?'
Be sure that everyone will understand the question in the same way – pre-test questions
Avoid jargon and abbreviations – do not assume people will know what something means just because you do, e.g. 'serious injury', or 'ADI'
Aim to wind up by 3.30. I am now pleased to hand over to Robert, who will introduce the first session: 'The 39 steps – to successful evaluation'

Question Writing Basics
Avoid leading questions, e.g. 'Do you agree this workshop was enjoyable?' as opposed to: 'Please rate how enjoyable you found this workshop'
Avoid using two negatives in one question, e.g. 'How much do you agree with the following statement: I never not wear a seat belt'
Avoid surplus questions. Do you really need to ask that question? How is it different to your other questions? How will you use the data?

Question Writing Basics
Be specific about what you are asking, e.g.:
'Please rate this course on a scale of 1-5 with 1 meaning Poor and 5 meaning Excellent'
versus the more specific:
'Please rate the following aspects of this course on a scale of 1-5 with 1 meaning Poor and 5 meaning Excellent' (Opportunity to ask questions, Knowledge of presenters, Use of examples...)
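
If ratings are collected per aspect as suggested above, the responses can then be summarised separately for each aspect rather than as one overall score. A minimal sketch, with invented responses:

```python
# Summarising 1-5 ratings per course aspect (responses invented for illustration).
responses = {
    "Opportunity to ask questions": [4, 5, 4, 3, 5],
    "Knowledge of presenters":      [5, 5, 4, 5, 4],
    "Use of examples":              [3, 4, 4, 3, 4],
}

for aspect, ratings in responses.items():
    mean = sum(ratings) / len(ratings)
    print(f"{aspect}: mean {mean:.1f} out of 5 (n={len(ratings)})")
```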

Training Day Survey Take a look at our own training day evaluation survey. What are the advantages and disadvantages of this design and method? Please complete the survey before you leave if possible.

We’re here to help rneedham@rospa.com

Thank you
Thank you. Contact details should you wish to contact me.