Review: Alternative Approaches II


Review: Alternative Approaches II What three approaches did we last cover? Describe one benefit of each approach. Which approach focuses on the marginalized? What were the five cautions the authors shared about the alternative approaches to evaluation?

Guidelines for Planning Evaluations: Clarifying the Evaluation Request and Responsibilities
Dr. Suzan Ayers, Western Michigan University (courtesy of Dr. Mary Schutten)

Individuals who Affect or are Affected by an Evaluation Study
Sponsor: authorizes the evaluation and provides resources for its conduct
Client: requests the evaluation
Stakeholders: those who have a stake in the program or in the evaluation’s results
Audiences: individuals, groups, and agencies who have an interest in the evaluation and receive its results

Understanding Reasons for Initiating Evaluation
Understanding the purpose of the evaluation is an important first step
–Did a problem prompt the evaluation?
–Did some stakeholder demand it?
–Who has the need to know?
–What does s/he want to know? Why?
–How will s/he use the results?

It is not uncommon for clients to be uninformed about evaluation procedures and not to have given much thought to the ramifications. Frequently, the purpose does not become clear until the evaluator has carefully read the relevant materials, observed the evaluation object, and interviewed stakeholders.

Practical Application to YOUR Plan: Questions to Begin
1) Why is this evaluation being requested? What questions will it answer?
2) To what use will the evaluation findings be put? By whom? What others should receive the information?
3) What is to be evaluated? What does it include? Exclude? During what time period? In what settings? Who will participate?

4) What are the essential program activities? How do they link with the goals and objectives? What is the program theory?
5) How much time and money are available for the evaluation? Who can help with it? Is any information needed immediately?
6) What is the political climate and context surrounding the evaluation? Will any political factors or forces interfere with gaining meaningful and fair information?

Informational Uses of Evaluation
Needs Assessment
–Determine whether sufficient need exists to initiate a program and describe the target audience
–Assist in program planning by identifying potential program models
Monitoring/Process Study
–Describe program implementation and whether changes from the initial model have occurred
Outcomes Study
–Examine whether certain goals are being achieved at desired levels
Cost-Effectiveness Study
–Judge overall program value and its cost-to-value ratio relative to competing programs
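The cost-effectiveness comparison mentioned above reduces to simple arithmetic: divide each program’s cost by the outcomes it produces and compare the ratios across competing programs. The following is a minimal sketch of that calculation; the program names and all dollar/outcome figures are hypothetical, not from the text.

```python
# Hypothetical cost-effectiveness comparison between competing programs.
# All names and figures below are invented for illustration.

programs = {
    # name: (total cost in dollars, outcome units achieved, e.g. participants served)
    "Program A": (50_000, 200),
    "Program B": (80_000, 400),
}

def cost_per_outcome(cost, outcomes):
    """Cost-effectiveness ratio: dollars spent per unit of outcome."""
    return cost / outcomes

for name, (cost, outcomes) in programs.items():
    print(f"{name}: ${cost_per_outcome(cost, outcomes):,.2f} per outcome unit")

# A lower ratio means the program delivers each outcome unit at lower cost;
# as the slide notes, this is only one input to an overall value judgment.
best = min(programs, key=lambda n: cost_per_outcome(*programs[n]))
print("Lowest cost per outcome:", best)
```

Here Program B, despite its higher total cost, is the more cost-effective option ($200 versus $250 per outcome unit), which is why the comparison must use the ratio rather than raw cost.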

Noninformational Uses
Postponement of a decision
Ducking responsibility [the decision is already made, but needs to be made to look good]
Public relations [justifying the program]
Fulfilling grant requirements
Covert, nefarious, political uses of information:
–Typically more common in federal/national evaluations

Conditions under which evaluation studies are inappropriate
Evaluation would produce trivial information
–Low-impact program, one-time effort
Evaluation results will not be used
–Regardless of outcome, political appeal/public support prevails
Evaluation cannot yield useful, valid information (bad information is worse than none)
–Well-intentioned efforts, “mission impossible” evaluations
Evaluation is premature for the stage of the program
–Evaluating a fitness program in its first 6 weeks will not yield meaningful information
–Premature summative evaluations are the most insidious misuse of evaluation
Motives of the evaluation are improper
–Ethical considerations, “hatchet jobs” (propriety: the evaluation respects the rights and dignity of data sources and helps organizations address all clients’ needs)

Determining Appropriateness
Use a tool called evaluability assessment
–Clarify the intended program model or theory
–Examine the program implementation to determine whether it matches the program model and could achieve the program goals
–Explore different evaluation approaches to match the needs of stakeholders
–Agree on evaluation priorities and intended uses of the study

Methods
Create a working group to clarify the program model or theory and to define information needs and evaluation expectations
–Personal interviews with stakeholders
–Reviews of existing program documentation
–Site visits
Figure 10.1 (p. 186): checklist to determine when to conduct an evaluation

Who Will Evaluate?
External
–Impartial, credible, brings expertise and a fresh look
–Participants may be more willing to reveal sensitive information to outsiders
–More comfortable presenting unpopular information, advocating changes, etc.
Internal
–Knowledge of the program, its history, context, etc.
–Familiarity with stakeholders
–Can serve as an advocate for using the findings
–Quick start-up
–Known quantity

Combination
–Internal evaluator collects contextual information
–Internal evaluator collects data
–External evaluator directs data collection and organizes the report
–Internal evaluator remains to advocate and provide support after the external evaluator is gone

Evaluator Qualifications/Skills
Does the evaluator have the ability to use the methodologies and techniques needed in the study?
…have the ability to help articulate the appropriate focus for the study?
…have the management skills to carry out the study?
…maintain proper ethical standards?
…communicate results to audiences so that they will be used?