ROAD SAFETY ETP EVALUATION TRAINING (www.roadsafetyevaluation.com)

AIM OF THE DAY
To develop participants’ ability to carry out ETP (education, training and publicity) evaluation

Intended Learning Outcomes
- Be familiar with the meaning of specific terms used in evaluation
- Be able to distinguish between goals, aims and objectives
- Be able to set appropriate objectives
- Be able to construct and use a logic model
- Understand the pros and cons of different evaluation designs
- Be able to write closed and open questions

Glossary Treasure Hunt!
Remember that pre-seminar task? Check your answers against the definitions displayed around the room. Make sure you introduce yourself to anyone you don’t already know!

The Evaluation Process
Planning and evaluating an intervention – key steps. What comes to mind?

7 Key Steps in Planning and Evaluating an Intervention
1) Needs Analysis – 1st data collection stage: what exactly is the problem, and what is the best way to solve it?
2) Intervention Planning – what are we going to do, and how? (aims and objectives)
3) Monitoring and Evaluation Framework – what, when and how to measure? 2nd data collection stage: baseline data

7 Key Steps in Planning and Evaluating an Intervention (continued)
4) Implement Intervention – 3rd data collection stage: collect monitoring data
5) Main Data Collection – collect ‘post-intervention’ data. Review the data collected – do you have what you need? Analyse the data
6) Report – write the report and disseminate the findings
7) Use – apply the findings: amend the intervention as necessary, and feed them into the needs analysis stage (Step 1) of future interventions

The Intervention-Evaluation Cycle
1. Identify the need, target group and likely ‘best’ intervention (local data, strategy, MAST, previous evaluations)
2. Plan the project and its evaluation: set aims and objectives, decide the evaluation design and methods
3. Deliver the intervention
4. Evaluate processes and/or outcomes
5. Publish and share the results
6. Feed results and lessons learned into future projects

What is the Difference...
- Between an aim and an objective?
- Between a goal and an aim?

Definitions: a refresher
Goal (very broad): the overall reason behind the intervention – e.g. to reduce road casualties
Aim(s) (specific): what and who the intervention will change – e.g. to reduce the incidence of drink-driving amongst year olds
Objective(s) (very specific): the specific outcomes the intervention intends to achieve (SMART) – e.g.
- To see an increase of 20% in the knowledge of year olds about the legal penalties for drink-driving, by September 2011
- To reduce mean scores in a self-report survey of drink-driving behaviour completed by year olds, by November 2011
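
A hypothetical worked example of checking the first objective against baseline data; this is a minimal Python sketch, and all scores are invented for illustration:

    # Hypothetical example: checking a "20% increase in knowledge" objective.
    # All numbers are invented for illustration, not from a real project.
    baseline_mean = 10.0  # mean quiz score before the intervention (baseline data)
    post_mean = 12.5      # mean score on the same quiz after the intervention

    percent_change = (post_mean - baseline_mean) / baseline_mean * 100
    print(f"Change in mean knowledge score: {percent_change:+.1f}%")  # +25.0%
    print("Objective met" if percent_change >= 20 else "Objective not met")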

Example: ‘Older drivers project’
Aims:
- To increase the self-awareness of older drivers
- To improve the driving skills of drivers over the age of 55

Older drivers project
Objectives (by March 2012):
- To increase older drivers’ knowledge of the effects of medication on driving ability
- To improve older drivers’ journey planning
- To increase self-identification of new aspects of their driving which could be improved
- To increase older drivers’ skills scores in a repeat test drive

Logic Model Components
- Inputs
- Outputs
- Outcomes
- Assumptions
- External factors
- Aims and objectives

EXAMPLE LOGIC MODEL
Inputs: staff, money, ADIs, research, venue, food/drink
Outputs: number of events held; number of older drivers attending; test drives
Short-term outcomes: increased knowledge about the effects of medication; increased journey planning skills
Intermediate outcomes: skills put into practice; drivers self-identify new areas for improvement
Long-term outcomes: older drivers continue to review and reflect on the areas for improvement; improved skills observed
Assumptions: drivers accept the ADI assessments of their skills; drivers identify themselves as an ‘older’ driver and see a need to attend the sessions/assessments; the sessions and assessments have a positive influence, i.e. do not make drivers unduly cautious or over-confident
External factors: adverse weather, local news stories involving older drivers, increased insurance premiums, family members, poor health

Evaluation Designs
- Post only with no comparison group
- Post only with comparison group
- Post then pre, no comparison group
- Pre and post with no comparison group (non-experiment)
- Pre and post with comparison group (quasi-experiment)
- Randomised controlled trial (RCT)

Evaluation Designs
Points to consider:
- How many groups of participants need to be surveyed, and how many times? (e.g. a post-only design with no comparison group surveys one group once)
- What are the implications for your time and resources, participants’ responses, and drop-out rates?
- Is any baseline measurement taken? You need a baseline to be able to measure change

Evaluation Designs
More points to consider:
- Can you know if it was the intervention that caused the change, or factors external to the intervention? e.g. age/maturation, other interventions in the area, or random chance (such as someone close to the participant being involved in a road traffic incident)
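
To make the comparison-group point concrete, the sketch below shows the arithmetic behind a quasi-experiment (pre and post with comparison group); all scores are invented for illustration. The comparison group’s change stands in for what would have happened anyway, so subtracting it out gives a simple ‘difference in differences’ estimate:

    # Hypothetical mean scores from a quasi-experiment: pre and post
    # measurements for an intervention group and a comparison group.
    # All numbers are invented for illustration.
    intervention_pre, intervention_post = 10.0, 13.0
    comparison_pre, comparison_post = 10.2, 11.0

    # Change in each group over the same period.
    intervention_change = intervention_post - intervention_pre  # 3.0
    comparison_change = comparison_post - comparison_pre        # ~0.8

    # The comparison group's change estimates what would have happened
    # anyway (maturation, other local interventions, chance), so we
    # subtract it out: a simple 'difference in differences'.
    effect_estimate = intervention_change - comparison_change
    print(f"Estimated intervention effect: {effect_estimate:+.1f} points")

A post-only design with no comparison group has neither the baseline nor the comparison change, which is why it cannot separate the intervention’s effect from these external factors.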

Methods in our madness!
- You want to show effectiveness and have the opportunity to do a quasi-experiment (e.g. pre and post with comparison group)
- Choose a method suitable for the intervention you have been working on, e.g. a telephone interview or self-complete questionnaire
- Write 5 questions. At least one question should be open-ended.

Question Writing Basics
- Keep questions short
- Do not ask two questions in one, e.g. ‘How enjoyable and informative did you find this workshop?’
- Be sure that everyone will understand the question in the same way – pre-test questions
- Avoid jargon and abbreviations – do not assume people will know what something means just because you do, e.g. ‘serious injury’ or ‘ADI’

Question Writing Basics
- Avoid leading questions, e.g. ‘Do you agree this workshop was enjoyable?’ as opposed to ‘Please rate how enjoyable you found this workshop’
- Avoid using two negatives in one question, e.g. ‘How much do you agree with the following statement: I never not wear a seat belt’
- Avoid surplus questions. Do you really need to ask that question? How is it different to your other questions? How will you use the data?

Question Writing Basics
Be specific about what you are asking. Compare:
- ‘Please rate this course on a scale of 1–5, with 1 meaning Poor and 5 meaning Excellent’
- ‘Please rate the following aspects of this course on a scale of 1–5, with 1 meaning Poor and 5 meaning Excellent’ (opportunity to ask questions, knowledge of presenters, use of examples...)
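
As a sketch of why the per-aspect wording yields more actionable data, here is a hypothetical tally (aspect names taken from the slide above; the ratings themselves are invented):

    # Hypothetical 1-5 ratings per course aspect (data invented for
    # illustration). A single overall rating would average these out
    # and hide the weak spot the per-aspect question reveals.
    ratings = {
        "Opportunity to ask questions": [4, 5, 3, 4, 5],
        "Knowledge of presenters": [5, 5, 4, 5, 5],
        "Use of examples": [2, 3, 2, 3, 2],
    }

    for aspect, scores in ratings.items():
        print(f"{aspect}: {sum(scores) / len(scores):.1f}")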

Training Day Survey
Take a look at our own training day evaluation survey. What are the advantages and disadvantages of this design and method? Please complete the survey before you leave if possible.

We’re here to help. Thank you!