Using empowerment evaluation to improve community-based programs Dr June Lennie AES lunchtime seminar Brisbane, 14 September 2005.


Outline of seminar Overview of empowerment evaluation methodology and principles The three steps in empowerment evaluation Evaluation of the Good Start Breakfast Club program Strengths and limitations of this approach

Background Increasing use of participatory and collaborative approaches to evaluation Value of these approaches in evaluating complex community-based programs Empowerment evaluation distinguished by its principles Requires commitment to democratic participation, inclusion and other key principles

Definition of empowerment evaluation An evaluation approach that aims to increase the probability of achieving program success by (1) providing program stakeholders with tools for assessing the planning, implementation, and self-evaluation of their program, and (2) mainstreaming evaluation as part of the planning and management of the program/organisation. (Wandersman et al., 2005, p. 28)

Development of empowerment evaluation Developed by David Fetterman at Stanford University in 1994 as a three-step approach: 1. Developing a mission and vision 2. Taking Stock 3. Planning for the Future Wandersman and others developed a 10-step approach called ‘Getting to Outcomes’

Uses of empowerment evaluation EE has been successfully used to: help make a children’s hospital more family-centred evaluate a school-based reading improvement program institutionalise evaluation as part of a higher education accreditation self-study develop and assess a $15 million Digital Village project foster organisational learning in a child abuse prevention collaborative

Ten underlying principles of empowerment evaluation 1. Improvement: A key aim is to improve people, programs, organisations and communities and to help them achieve results. 2. Community ownership: Program stakeholders, with the assistance of evaluators, take responsibility for designing and conducting the evaluation and putting the findings to use. 3. Inclusion: Participants, staff from all levels of a program or organisation, funders, and community members are invited to participate.

EE principles (cont.) 4. Democratic participation: Active participation by all in shared decision making is valued; processes are based on deliberation, action and authentic collaboration. 5. Social justice: High value placed on addressing the larger social good of practices and programs and achieving a more equitable society. Seen as a means to help people address inequities through capacity building. 6. Community knowledge: Community-based knowledge, information and experience is respected and used to make decisions, understand the local context and interpret results.

EE principles (cont.) 7. Evidence-based strategies: Value placed on providing empirical justifications for action and drawing on other evidence-based strategies that have worked. However, they need to be adapted to the local environment, culture and conditions. 8. Capacity-building: Program staff and participants learn how to conduct their own evaluations. All people and organisations are seen as capable of conducting evaluations when provided with the appropriate tools and conditions.

EE principles (cont.) 9. Organisational learning: EE helps create a community of learners. Continually reflecting on and evaluating programs and organisations makes groups or organisations more responsive to changes and challenges. Evaluation results are used to guide improvement. 10. Accountability: Individuals and organisations are held accountable for commitments made. Funders are held accountable concerning their expectations. A commitment is made to results-based interventions and continuous improvement.

Roles of the professional evaluator Facilitator, critical friend, coach, teacher and evaluation expert Supports the purpose of the program, wants it to succeed (i.e. not impartial) Helps participants develop a rigorous and organised approach to evaluation and clarify their theories of change Helps establish baseline data, monitor interventions and document change over time Ensures everyone has an opportunity to speak

Three steps in an empowerment evaluation Step 1: Developing a mission and vision Key phrases are developed that capture the mission and vision of the program This is done even when a mission and vision statement already exists Allows new ideas and divergent views about the program or organisation to emerge Consensus is reached on the statements which represent the values of the group

School of Education mission statement phrases

Three steps in an empowerment evaluation (cont.) Step 2: Taking Stock Part 1: Brainstorm a list of 10-20 activities crucial to the functioning of the program. A voting process is used to prioritise the list and identify the 10 most important activities to evaluate Part 2: Rate how well the 10 key activities are doing on a 1-10 scale; discuss ratings. Discussion provides evidence to support the ratings, and also provides baseline data on the program and its strengths and weaknesses.
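The Taking Stock arithmetic described above (tallying prioritisation votes, then averaging the 1-10 ratings) can be sketched in a few lines of Python. This is a hypothetical illustration only: the activity names, ballots and ratings below are invented, not data from any actual EE workshop.

```python
from collections import Counter

# Part 1: each participant votes for the activities they consider
# most crucial to the functioning of the program.
ballots = [
    ["providing breakfast", "training volunteers", "fundraising"],
    ["providing breakfast", "nutrition education", "training volunteers"],
    ["providing breakfast", "fundraising", "nutrition education"],
]
tally = Counter(activity for ballot in ballots for activity in ballot)

# Keep the highest-voted activities (here the top 2; a real workshop keeps 10).
key_activities = [activity for activity, _ in tally.most_common(2)]

# Part 2: each participant rates the key activities on a 1-10 scale;
# the group average gives baseline data for each activity.
ratings = {
    "providing breakfast": [8, 7, 9],
    "training volunteers": [5, 6, 4],
}
averages = {activity: sum(r) / len(r) for activity, r in ratings.items()}
```

In a real Taking Stock session the individual ratings are recorded alongside the average, so that outlying scores can prompt the discussion that supplies evidence for each rating.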

Example of prioritisation exercise from Part 1 of Taking Stock

Example of ratings matrix from Part 2 of Taking Stock

Three steps in an empowerment evaluation (cont.) Step 3: Planning for the Future Part 1: Brainstorm realistic goals for each key program activity Part 2: Develop strategies to help reach these goals Part 3: Identify forms of documentation or evidence that will monitor progress towards goals (e.g. surveys, checklists, minutes of meetings, development of a website) A series of meetings and workshops is then held to plan and implement the evaluation in more detail
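The output of the three parts above is naturally a small table per key activity. A minimal sketch of how such a record might be structured and sanity-checked follows; the activity, goals, strategies and evidence named here are hypothetical examples, not taken from the GSBC evaluation.

```python
# One record per key activity: goals, strategies to reach them,
# and the documentation that will monitor progress (Parts 1-3 of Step 3).
plan = {
    "volunteer training": {
        "goals": ["all volunteers complete an induction session"],
        "strategies": ["run quarterly induction workshops"],
        "evidence": ["attendance checklists", "feedback surveys"],
    },
}

# Sanity check: every key activity should list at least one goal
# and at least one form of evidence with which to monitor it.
for activity, record in plan.items():
    assert record["goals"], f"no goals listed for {activity!r}"
    assert record["evidence"], f"no monitoring evidence for {activity!r}"
```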

Example of Planning for the Future exercise: Children’s hospital project Key activity: Training staff Goals: Consensus as to what Family Centred Care is; Include medical staff; Complete training curriculum Strategies: Formal training pilot; Identify critical people (medical staff); Ongoing training; Grouped by discipline or multidisciplinary? Evidence: Experimental studies; Attendance checklists for training; Pre and post test using Likert scale; Training materials

Use of EE to evaluate the Good Start Breakfast Club program Background: A school Breakfast Club was established in NSW by Australian Red Cross in 1991 The Sanitarium Health Food company has supported the Club since 2001 GSBC program launched in December 2003, following increased support from Sanitarium

Good Start Breakfast Club program Clubs in nearly 100 primary schools in New South Wales, Victoria, South Australia, Queensland, Tasmania and the Northern Territory Involves over 1,200 volunteers who deliver over 140,000 breakfasts to children in need each year Anecdotal evidence on program impacts includes that it increases students’ ability to learn, and improves their social and behavioural skills and nutritional knowledge

Background to evaluation of GSBC program No evaluation had previously been conducted to determine the nutritional, social and educational impacts of the program on children The Graduate School of Public Health, University of Wollongong was approached by Sanitarium and the Adventist Development and Relief Agency to develop a research methodology for use in the Breakfast Club

Project to develop practical methods to evaluate breakfast programs ARC Linkage grant obtained by UOW in 2003 for a PhD scholarship and some research costs for the project ‘Practical methods to evaluate school breakfast programs’ Following a literature review, EE was selected to evaluate the program: it had demonstrated strengths and simplicity, and was congruent with the values and objectives of the GSBC program A case study of the impacts of the EE approach on the delivery of the program will be developed An expected outcome of the project is the development of a practical and useful GSBC Evaluation Toolkit

To date, the evaluation has involved: 1. Questionnaires distributed to volunteers and teaching staff in most program areas to enable input into the EE process 2. EE workshops with GSBC coordinators and managers from around Australia, held over two days in May 3. Two full-day EE workshops with GSBC volunteers and teaching staff from schools in Sydney and Western NSW in July 4. Feedback questionnaires on EE methodology and workshops. Data also collected on gender, age, role in program, location and prior knowledge of participatory evaluation

Outcomes so far Development of proposed new mission and vision statements Identification of key activities that will be evaluated (eg. ‘volunteer management and support’; ‘providing breakfast to children in need’) Individual and average ratings for each key activity and reasons for giving these ratings For each key activity, a list of goals, strategies to meet these goals, and suggested forms of evidence to monitor progress towards goals

Feedback on methodology and methods Most thought the methodology was valuable Opportunity to share information, experiences and ideas appreciated Knowledge and understanding of participatory program evaluation increased More time needed at some workshops for discussion on program’s strengths and weaknesses and Planning for the Future step

Strengths of empowerment evaluation include: Robust process; demonstrated effectiveness in improving community-based programs Builds evaluation capacities and a culture of evaluation based on continuous improvement and learning Evaluation designed and controlled by program staff, participants and stakeholders Process is participatory and inclusive Methods aim to be democratic and empowering Enables the ongoing collection of reliable, evidence-based data Can provide more open and honest assessments of a program’s strengths and weaknesses

Limitations and issues include: Evaluators need a wide range of skills, including facilitation and training, and knowledge of program evaluation Requires careful management of power relations and differing agendas, values and perspectives Process requires time and resources that are not always available Involvement of volunteers can be problematic

Conclusion EE has many strengths that make it valuable for improving community-based programs and increasing their sustainability However, a number of issues need to be considered, such as the time and resources required to build capacities and include a diversity of people in planning and conducting the evaluation To be effective, it requires a commitment to its principles by senior management, staff and participants

Some resources on empowerment evaluation Empowerment Evaluation website: Fetterman, D. (2001). Foundations of Empowerment Evaluation. Thousand Oaks, California: Sage (includes details of the three-step empowerment evaluation process) Fetterman, D. and Wandersman, A. (eds) (2005). Empowerment Evaluation: Principles in Practice. New York: The Guilford Press (includes a critique of the method) Paper by Wayne Miller and June Lennie: ‘Empowerment evaluation: A practical method for evaluating a school breakfast program’. To be presented at the AES Conference, October 2005