Case Studies – Australia
Ross Attrill – International IDEA


Voter Education Evaluation
The electoral educators all seemed very passionate about the importance of what they were delivering. However, they were less passionate about the evaluation aspect of their role. There did not appear to be consistent support for evaluation among electoral educators, nor any consistently applied approach to evaluation.

The Case Studies
- Evaluation Methodology for Electoral Education Programs
- Youth Electoral Study (YES)

Case Study 1 - Evaluation Methodology for Electoral Education Programs
Electoral Education Context - Three Main Streams
- Electoral Education Centres (EECs)
- School and Community Visits Program (SCVP)
- Teacher Training - “Your Vote Counts” (YVC)

Case Study 1 - Evaluation Methodology for Electoral Education Programs
Choose Clear Objectives for the Program
Participants should:
- Understand the role of the AEC;
- Understand the concept of representation in a democracy;
- Be aware of compulsory enrolment;
- Be aware of compulsory voting;
- Understand preferential voting (alternative vote); and
- Understand the concept of formality (validity).

Case Study 1 - Evaluation Methodology for Electoral Education Programs
How the Results Will Be Used:
- Review and update the content of the education sessions.
- Measure the degree of participant/customer satisfaction with the AEC program.
- Assess the appropriateness of the delivery and content of the AEC education sessions for all audiences.
- Provide data and information for inclusion in various executive reports and the AEC's Annual Report.
- Inform the development of business plans.

Case Study 1 - Evaluation Methodology for Electoral Education Programs
Performance Indicators
1. Participant feedback that indicates improved knowledge and increased understanding of electoral matters. Target: 95% for SCVP and YVC sessions; maintain or exceed previous years' results for EEC sessions.
2. Audience satisfaction with the education program. Target: 95% for SCVP and YVC; a high level of audience satisfaction for EEC sessions.
3. Percentage of 17 and 18 year old participants in EEC sessions who are more likely to vote at the next election. Target: 75%.
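Once evaluation responses are tallied, checking these targets is simple arithmetic. A minimal sketch in Python, where the field names and tallies are illustrative assumptions, not taken from the actual AEC instruments:

```python
# Check performance-indicator targets against tallied evaluation responses.
# Tallies and field names below are invented for illustration.

def met_target(positive: int, total: int, target_pct: float) -> bool:
    """True if the share of positive responses meets or exceeds the target."""
    if total == 0:
        return False
    return 100.0 * positive / total >= target_pct

# Hypothetical tallies for one year's SCVP sessions:
improved_knowledge = met_target(positive=388, total=400, target_pct=95.0)  # 97.0%
satisfaction = met_target(positive=371, total=400, target_pct=95.0)        # 92.75%
likely_to_vote = met_target(positive=310, total=400, target_pct=75.0)      # 77.5%

print(improved_knowledge, satisfaction, likely_to_vote)  # True False True
```

In this hypothetical year, indicator 2 would fall short of its 95% target and flag the satisfaction results for review.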

Case Study 1 - Evaluation Methodology for Electoral Education Programs
Pre-Session Evaluations - Baseline Data
The ideal time to measure pre-program knowledge against the key program objectives is immediately prior to participation in the program.

Case Study 1 - Evaluation Methodology for Electoral Education Programs
Post-Session Evaluation - Timing
What is the best time to conduct the evaluation?
- Immediately post-session
- Follow-up review questionnaire at a much later date (teachers)
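Asking the same knowledge questions immediately before and after the session makes the knowledge gain a straightforward difference of proportions. A sketch under assumed data shapes (the objective key and responses are invented, not from the AEC questionnaires):

```python
# Compare pre-session (baseline) and post-session scores on the same
# knowledge item. The objective key and responses are invented examples.

def pct_correct(responses: list[dict], objective: str) -> float:
    """Percentage of respondents answering the objective's test item correctly."""
    correct = sum(1 for r in responses if r.get(objective))
    return 100.0 * correct / len(responses)

pre = [{"preferential_voting": False}, {"preferential_voting": False},
       {"preferential_voting": True}, {"preferential_voting": False}]
post = [{"preferential_voting": True}, {"preferential_voting": True},
        {"preferential_voting": True}, {"preferential_voting": False}]

gain = pct_correct(post, "preferential_voting") - pct_correct(pre, "preferential_voting")
print(f"Knowledge gain: {gain:.0f} percentage points")  # 75% - 25% = 50 points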

Case Study 1 - Evaluation Methodology for Electoral Education Programs
How Much Is Enough?
The number of evaluation responses should ensure the results obtained are robust. It was suggested that a minimum of 400 evaluations per year be collected in each EEC and in the SCVP.
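The figure of 400 is consistent with standard sampling arithmetic: assuming something close to a simple random sample, the worst-case 95% margin of error for an estimated proportion is 1.96 × sqrt(0.25 / n), which at n = 400 is roughly ±5 percentage points. A quick check:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Worst-case 95% margin of error (in percentage points) for a proportion."""
    return 100.0 * z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(400), 1))  # 4.9
```

Quadrupling the sample only halves the margin, so 400 responses per site is a reasonable trade-off between precision and collection effort.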

Case Study 1 - Evaluation Methodology for Electoral Education Programs
Questionnaire Design
- Short, and ideally completed in 5 minutes or less.
- Record essential demographic information.
- Measure attendee knowledge against the key program objectives.
- Test attendee knowledge against the key program objectives rather than asking attendees to self-assess.
- Tailored for each target audience.
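Testing knowledge rather than self-assessment means scoring multiple-choice items against an answer key. A minimal sketch; the question identifiers and answers are hypothetical, not drawn from any AEC questionnaire:

```python
# Score a short knowledge test against an answer key instead of relying
# on self-assessed understanding. Question IDs and answers are invented.

ANSWER_KEY = {"q1_role_of_aec": "b", "q2_enrolment": "a", "q3_formality": "c"}

def score(responses: dict) -> float:
    """Fraction of test items answered correctly."""
    correct = sum(1 for q, a in ANSWER_KEY.items() if responses.get(q) == a)
    return correct / len(ANSWER_KEY)

print(round(score({"q1_role_of_aec": "b", "q2_enrolment": "a", "q3_formality": "d"}), 2))  # 0.67
```

An objective score like this can be compared across sessions and years, which a self-rated "how much did you learn?" item cannot.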

Case Study 2 - Youth Electoral Study (YES) 2004
Rationale
- To investigate reasons for youth disengagement from political processes and institutions
- To provide data on which to base a revised Youth Voter Education program

Case Study 2 - Youth Electoral Study (YES) 2004
Areas Investigated
- The influence of family on engagement
- The influence of school on engagement
- The influence of political parties on engagement
- The influence of political knowledge on engagement

Case Study 2 - Youth Electoral Study (YES) 2004
Methodologies
- Review of existing literature
- Case studies based on in-depth group interviews
- National School Survey: schools, 4600 students

Case Study 2 - Youth Electoral Study (YES) 2004
Focus Questions
- What sorts of political actions do you take part in?
- Rank voting against other events in terms of excitement.
- Rank the effect of your family on your political participation.
- Do you feel like you have enough knowledge to participate in political processes?

Case Study 2 - Youth Electoral Study (YES) 2004
Outcome
Australian Electoral Commission Youth Communication Strategy

Conclusions
- Each evaluation methodology must be appropriate to what is being evaluated.
- Each will use different approaches in order to capture the data needed.
In general, there are some rules that should be followed:
- Choose clear program objectives
- Decide how the results will be used
- Choose challenging but achievable performance indicators
- Get the timing right
- Collect baseline data
- Design your survey tools carefully
- Ensure that the process of data collection is as painless as possible for everyone

Thank you and good luck!