Dare to Evaluate Roger A. Rennekamp, Ph.D. Department Head and State 4-H Program Leader Youth Development Education Oregon State University

Reflections on a Quarter Century of Extension Evaluation Practice: Twenty-five years have passed since the Journal of Extension published its landmark issue dedicated to program evaluation. That issue served as a “call to action” following several critical investigations of Cooperative Extension which concluded that Extension is “no longer relevant,” has “no clearly defined focus,” and is “short on impacts” and “long on documenting participation and activity.”

The Challenge in 1983 It can no longer be taken for granted that programs are good and appropriate. Extension is operating in a new environment, one more open to criticism and demands for justification of actions. All publicly funded agencies, not just Extension, are vulnerable in these times. In an era of accountability, Extension must be able to document who is being served and how. It also needs to document that programs are achieving positive results. (Andrews, 1983)

Three Areas of Progress The use of logic modeling has become widespread. Capacity to conduct evaluation has increased markedly. Data for decision making is readily available.

The New “Call to Action” Better understand logic modeling. Build capacity for increased rigor in evaluation. Rethink the purpose of evaluation in Extension.

A logic model is…
A. a framework for program planning that links inputs and activities to program outcomes.
B. useful in formulating evaluation questions.
C. a graphic representation of the theory which underlies a program.
D. all of the above.

Put the Logic into Logic Models Thinking about program planning has evolved significantly, from Bennett (1975) to Boyle (1981) to Boone (1985) to Bennett and Rockwell (1995) to logic modeling (Taylor-Powell). Logic models have been widely adopted as a model for program planning and a framework for evaluation. But logic models are more than “templates for preparing plans of work” or “forms to be filled out.”

Inputs (resources deployed to address the situation): staff, volunteers, time, money, materials, equipment, technology, partners.
Outputs, Activities (activities supported by the resources invested): workshops, meetings, field days, demonstrations, camps, trainings, web sites, home visits.
Outputs, Participation (individuals or groups who participate in the activities): number, characteristics, reactions.
Outcomes, Initial (learning that results from participation): awareness, knowledge, opinions, skills, aspirations.
Outcomes, Intermediate (actions that result from learning): practices, behaviors, policies, social action, choices.
Outcomes, Long-Term (conditions that change as a result of action): social, economic, environmental.
Contextual factors influence every stage of the model.
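Read as a data structure, the columns above map naturally onto a simple record type. Below is a minimal sketch in Python; the class name, field names, and the sample camp program are illustrative assumptions, not part of any Extension system.

    from dataclasses import dataclass, field

    @dataclass
    class LogicModel:
        """One program's logic model: inputs flow to outputs, outputs to outcomes."""
        inputs: list[str] = field(default_factory=list)         # resources deployed
        activities: list[str] = field(default_factory=list)     # outputs: what we do
        participation: list[str] = field(default_factory=list)  # outputs: who we reach
        initial: list[str] = field(default_factory=list)        # outcomes: learning
        intermediate: list[str] = field(default_factory=list)   # outcomes: action
        long_term: list[str] = field(default_factory=list)      # outcomes: conditions
        context: list[str] = field(default_factory=list)        # contextual factors

    # Hypothetical example: a 4-H food-safety camp.
    camp = LogicModel(
        inputs=["staff", "volunteers", "grant money"],
        activities=["camps", "trainings"],
        participation=["120 youth, ages 9-12"],
        initial=["knowledge of safe food handling"],
        intermediate=["adoption of safe food-handling practices"],
        long_term=["reduced incidence of foodborne illness"],
        context=["county budget cycle", "school calendar"],
    )

Naming each component explicitly is what turns an implicit program theory into an explicit one, as the next slide argues.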

Put the Logic into Logic Models A logic model should represent an underlying theory of how a program operates. Logic models are “pictures” of programs; implicit program theory becomes explicit. Linkages between inputs, outputs, and outcomes can be based on research, intuition, experience, and, at times, untested assumptions. As these linkages are confirmed, the theory becomes increasingly sound and mature.

The degree of rigor built into my evaluations is most frequently influenced by…
A. my level of knowledge and skill in program evaluation.
B. the relative need for accuracy and confidence in the evaluation results.
C. resource limitations.
D. lack of technical assistance with evaluation.

Build Capacity for Rigor Rigor refers to the technical qualities of an evaluation that make it convincing. How much rigor is necessary? A bad evaluation may be worse than no evaluation at all, but an overly sophisticated evaluation may waste precious resources. Decisions about rigor depend on the need for precision, the need for acceptance of the results, and the need to generalize the findings.
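The precision-versus-resources trade-off can be made concrete with the standard sample-size formula for estimating a proportion, n = z²p(1−p)/e². The sketch below is a textbook calculation added for illustration, not something from the talk:

    import math

    def sample_size(margin_of_error: float, z: float = 1.96, p: float = 0.5) -> int:
        """Respondents needed to estimate a proportion within +/- margin_of_error.

        Uses the conservative p = 0.5 and z = 1.96 (95% confidence); ignores
        any finite-population correction.
        """
        return math.ceil(z ** 2 * p * (1 - p) / margin_of_error ** 2)

    print(sample_size(0.10))  # 97 respondents for +/- 10 points
    print(sample_size(0.05))  # 385 respondents for +/- 5 points
    print(sample_size(0.03))  # 1068 respondents for +/- 3 points

Halving the margin of error roughly quadruples the required sample, which is precisely why the needed level of precision, rather than habit, should determine how much rigor an evaluation buys.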

Build Capacity for Rigor Is training the answer? That puts the burden on field staff. Is hiring program evaluators the answer? That puts the burden on evaluators. Newer approaches recognize that individual development and organizational development go hand in hand, using experiential methods in which the evaluator serves as an evaluation coach working alongside program staff.

The purpose for which I most frequently conduct evaluations is…
A. to generate impact data for stakeholders.
B. to improve the program.
C. to better understand how the program works and advance the field.
D. to assess the need for the program.

Rethink Evaluation Purpose Is the goal of evaluation to prove or to improve? Sometimes we approach evaluation as if we have something to prove. Other times we approach it with the aim of discovering new information that will help improve the program. Perhaps we are a bit out of balance.

Rethink Evaluation Purpose Joan Thomson (1983, p. 3), then editor of the Journal of Extension, wrote in her introductory notes to the evaluation issue that the “rationale for conducting Extension program evaluation in today’s complex environment…is often overshadowed by a suspicion of who, why, and for what is Extension being questioned.” Consequently, individual and organizational learning took a back seat to countering the criticism that had been levied against Extension.

Rethink Evaluation Purpose Evaluation questions can come from any place on the logic model. If we already know that A → B → C, why keep measuring C? This has important implications for program quality standards and measures. Ask instead, “What single piece of information, if known, would strengthen confidence in your program?” Strengthen the program, strengthen the theory, strengthen the field.
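One way to picture that question: annotate each linkage in the program theory with the strength of its supporting evidence, and point the next evaluation at the weakest link instead of re-measuring the end of the chain. The linkages and evidence ratings below are hypothetical, a sketch of the idea rather than an actual 4-H program theory:

    # Each linkage in a program theory, rated 0 (untested assumption) to 1
    # (well confirmed by research or prior evaluation).
    linkages = {
        ("workshops", "knowledge gain"): 0.9,        # confirmed repeatedly
        ("knowledge gain", "practice change"): 0.4,  # intuition and experience only
        ("practice change", "community condition"): 0.7,
    }

    # The weakest link is where one new piece of information would add the
    # most confidence in the program's theory.
    cause, effect = min(linkages, key=linkages.get)
    print(f"Focus the next evaluation on: {cause} -> {effect}")

Here the sketch would direct effort at the knowledge-to-practice linkage rather than at yet another measurement of the end outcome.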

Conclusion Deep understanding of a program’s theory of change is essential to sound programming. Increased understanding of theory results in more relevant evaluation questions. Consequently, Extension becomes increasingly able to provide valid and reliable data for decision making.

Conclusion Learning organizations have a hunger for new information that makes them more efficient and effective. Through evaluation, members of the organization gain new information, insights, and perspectives on their programs that enable them to work in new ways. As they do, they rise to new levels of personal effectiveness and facilitate peak organizational performance.

References
Braverman, M. T., Engle, M., Arnold, M. E., & Rennekamp, R. A. (Eds.). (2008). Program evaluation in a complex organizational system: Lessons from Cooperative Extension. New Directions for Evaluation, 120. Jossey-Bass.
Rennekamp, R. A., & Arnold, M. E. (2009). What progress, program evaluation? Reflections on a quarter-century of evaluation practice in Extension [Commentary]. Journal of Extension. In press.