Empowerment Evaluation
December 9, 2011 | EPS 654: Program Evaluation | Katherine Coder


“Empowerment evaluation helps transform the potential energy of a community into kinetic energy.” (Fetterman & Wandersman, 2007, p. 182)

Wandersman et al. (2005) define empowerment evaluation as “an evaluation approach that aims to increase the probability of achieving program success by (1) providing program stakeholders with tools for assessing the planning, implementation, and self-evaluation of their program, and (2) mainstreaming evaluation as part of the planning and management of the program/organization” (p. 28).

Key Concepts (Fetterman, 2007)
- Collect evidence
- Have a “critical friend”
- Develop a culture of evidence
- Establish cycles of reflection and action
- Cultivate a community of learners
- Contribute to the development of reflective practitioners

10 Key Principles (Fetterman & Wandersman, 2007)
1. Improvement
2. Community ownership
3. Inclusion
4. Democratic participation
5. Social justice (removing inequities)
6. Community-based knowledge
7. Evidence-based strategies
8. Capacity building
9. Organizational learning
10. Accountability

EE: 3-Step Approach (Fetterman & Wandersman, 2007)
1. Develop the Mission: determine the mission statement and group values through a democratic process in which meaning can be made and every voice is given a fair hearing.
2. Taking Stock: list current activities and use a cooperative process to determine organizational priorities. First take a baseline account of how the organization is doing on priority issues; after the organization has implemented interventions, another iterative round of Taking Stock is completed to measure success.
3. Plan for the Future: establish goals and strategies, and identify evidence/indicators. This step represents the intervention phase of the process. After the intervention, return to the next round of Taking Stock to evaluate the program’s success.
The EE process is geared toward institutionalization: iterative rounds of Taking Stock and Planning for the Future develop a learning culture.

10-Step Getting to Outcomes (GTO) (Fetterman & Wandersman, 2007)
This approach asks 10 questions and helps those using the system find the relevant literature, methods, and tools:
1. What are the needs and resources in your organization, school, community, or state? (needs assessment; resource assessment)
2. What are the goals, target population, and desired outcomes (objectives) for your school/community/state? (goal setting)
3. How does the intervention incorporate knowledge of science and best practices in this area? (science and best practices)
4. How does the intervention fit with other programs already being offered? (collaboration; cultural competence)
5. What capacities do you need to put this intervention into place with quality? (capacity building)
6. How will this intervention be carried out? (planning)
7. How will the quality of implementation be assessed? (process evaluation)
8. How well did the intervention work? (outcome and impact evaluation)
9. How will continuous quality improvement strategies be incorporated? (total quality management; continuous quality improvement)
10. If the intervention (or components of it) is successful, how will it be sustained? (sustainability and institutionalization)

EE Tools (Fetterman, 2007; Fetterman & Wandersman, 2007)
EE encourages the use of technologies that align with its principles, including:
- online surveys
- digital photos
- blogs
- picture sharing
- collaborative web sites
- YouTube
- videoconferencing
- spreadsheets

Critiques of Empowerment Evaluation (Donaldson, Patton, Fetterman, & Scriven, 2010; Fetterman & Wandersman, 2007)
A number of critiques of EE have been voiced, centering on conceptual ambiguity, methodological specificity, and outcomes:
- empowering others (does it create an empowering setting?)
- advocacy (evaluators are not necessarily advocates; no evaluation is neutral)
- consumers (focuses too little on participants)
- compatibility (internal and external) between traditional evaluation and EE
- practical versus transformative forms
- EE as evaluation (not a movement)
- bias (self-serving)
- social agenda (yes)
- ideology (not a methodology)
- differences between collaborative, participatory, and empowerment evaluation

As of 2005, EE as a discipline is said to be growing in four key areas:
1. Defining the field
2. Clarifying EE concepts and principles
3. Methodological specificity (the 3-step and 10-step approaches, among others)
4. Documenting outcomes
As of 2007, Fetterman and Wandersman see EE as strengthening in four areas:
1. Combining quantitative and qualitative data
2. Capturing the critical “ah-hah” moments more systematically
3. Translating EE into policy language
4. Learning how to build more refined EE tools and systems

EE Example: Bridging the Digital Divide / Tribal Digital Village
HP awarded $15 million to three digital villages in the US, one of which consisted of 18 Native American tribes.

Tribal Digital Village Outcomes:
- Created the largest unlicensed wireless network in the country (as part of their own sovereign nation); the effort was recognized and lauded by the head of the FCC
- Trained their young people to maintain the network (building capacity)
- Secured an E-rate grant providing $1 million/year from the telephone companies toward building and maintaining the network
- Created a high-end digital printing press, a small business enterprise contributing to economic sustainability
- Implemented a parent involvement and education center
- Held video recording workshops that have enabled the recording of native history through personal family stories

References
Donaldson, S. I., Patton, M. Q., Fetterman, D. M., & Scriven, M. (2010). The 2009 Claremont debates: The promise and pitfalls of utilization-focused and empowerment evaluation. Journal of Multidisciplinary Evaluation, 6(13).
Fetterman, D. M. (2002). Empowerment evaluation: Building communities of practice and a culture of learning. American Journal of Community Psychology, 30(1).
Fetterman, D. M. (2005). Empowerment and ethnographic evaluation: Hewlett-Packard’s $15 million digital divide project (a case example). NAPA Bulletin, 24.
Fetterman, D. M. (2007, July). Empowerment evaluation [PowerPoint presentation]. Australasian Evaluation Society. Retrieved from homepage.mac.com/profdavidf/documents/Canberra.pdf
Fetterman, D., & Wandersman, A. (2007). Empowerment evaluation: Yesterday, today, and tomorrow. American Journal of Evaluation, 28(2).

Thank You! Questions?