Evaluation Planning II: Setting Boundaries and Analyzing the Evaluation Context
Dr. Suzan Ayers, Western Michigan University (courtesy of Dr. Mary Schutten)


Four Considerations
  Identifying evaluation audiences
  Setting boundaries on whatever is evaluated
  Analyzing evaluation resources
  Analyzing the political context

1. Audience Identification
  An evaluation is adequate only if it collects information from, and reports to, all legitimate evaluation audiences.
  Primary audience: the sponsor and the client.
  Secondary audiences: depend on how the evaluator defines constituents.
  It is common to limit the evaluation to too narrow an audience (see Figure 11.1, p. 202).
  Return to the list of audiences periodically.
  Who will use the results, and how, is key to outlining the study.

Potential Secondary Audiences
  Policy makers
  Managers
  Program funders
  Representatives of program employees
  Community members
  Students and their parents (or other program clients)
  Retirees
  Representatives of influence groups

2. Setting the Boundaries
  Starting point: a detailed description of the program being evaluated.
  Program description: describes the critical elements of the program (goals, objectives, activities, target audiences, physical setting, context, personnel).
  The description must be thorough enough to convey the program's essence.

Characterizing the Evaluand
  What problem was the program designed to correct?
  Of what does the program consist?
  What is the program's setting and context?
  Who participates in the program?
  What is the program's history? Its duration?

  When and under what conditions is the program implemented?
  Are there unique contextual events (contract negotiations, budgets, elections, etc.) that may distort the evaluation?
  What resources (human, material, time) are consumed by the program?
  Has there been a previous evaluation?

Program Theory
  A specification of what must be done to achieve the desired goals, what other impacts may be anticipated, and how the goals and impacts would be generated (Chen, 1990).
  Serves as a tool for:
    Understanding the program
    Guiding the evaluation
  Evaluators must understand the assumptions that link the problem to be resolved with program actions and characteristics, and those actions and characteristics with the desired outcomes.

Helpful in Developing Program Theory (Rossi, 1971)
  1. Causal hypothesis: links the problem to a cause
  2. Intervention hypothesis: links program actions to the cause
  3. Action hypothesis: links the program activities with reduction of the original problem

Sample Problem
  Declining fitness levels in children
    Causal hypothesis?
    Intervention hypothesis?
    Action hypothesis?
  (One hypothetical way to fill these in is sketched below.)
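To make the three hypotheses concrete, here is a minimal sketch of how the sample problem might be filled in. The specific causes and program actions are illustrative assumptions, not content from the slides; the small Python structure is just one way to record a program theory during planning.

```python
# Hypothetical program theory for the sample problem: declining fitness
# levels in children. The content of each hypothesis is an illustrative
# assumption, not taken from the lecture.

program_theory = {
    "problem": "Declining fitness levels in children",
    # Causal hypothesis: links the problem to a cause.
    "causal": "Increased sedentary screen time reduces daily physical "
              "activity, which lowers children's fitness levels.",
    # Intervention hypothesis: links program actions to the cause.
    "intervention": "A daily 30-minute in-school activity program displaces "
                    "sedentary time with moderate-to-vigorous activity.",
    # Action hypothesis: links the program activities with reduction of
    # the original problem.
    "action": "Sustained participation in the activity program raises "
              "children's measured fitness over the school year.",
}

for label, statement in program_theory.items():
    print(f"{label.upper()}: {statement}")
```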

Methods for Describing the Evaluand
  Descriptive documents
    Program documents, proposals for funding, publications, minutes of meetings, etc.
  Interviews
    Stakeholders, all relevant audiences
  Observations
    Observe the program in action; get a “feel” for what really is going on.
    Observations often reveal the difference between how the program runs and how it is supposed to run.

  Challenge of balancing different perspectives
    Minor differences may reflect stakeholder values or positions and can be informative.
    Major differences require that the evaluator attempt to achieve some consensus description of the program before initiating the evaluation.
  Redescribing the evaluand as it changes; changes may be due to:
    Responsiveness to feedback
    Implementation not quite aligned with the designers' vision
    Natural historical evolution of the evaluand

3. Analyzing Evaluation Resources: $
  Cost-free evaluation: cost savings realized through the evaluation may pay for the evaluation over time (a rough sketch of this logic follows below).
  If budget limits are set before the evaluation process begins, they will affect the planning decisions that follow.
  Often the evaluator has no input into the budget.
  Offer two or three levels of service (Chevy vs. BMW).
  Budgets should remain somewhat flexible so the evaluation can pursue new insights that emerge during the process.
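As a back-of-the-envelope illustration of the "cost-free evaluation" idea, the sketch below checks how long projected savings would take to cover an evaluation's cost. All figures are invented for the example; nothing here comes from the lecture.

```python
# Hypothetical "cost-free evaluation" arithmetic: how many years of
# projected program savings would it take to recoup the evaluation's
# cost? All numbers are illustrative assumptions.

evaluation_cost = 40_000   # one-time cost of the evaluation ($)
annual_savings = 12_500    # projected yearly savings from acting on
                           # the evaluation's recommendations ($)

years_to_break_even = evaluation_cost / annual_savings
print(f"Break-even after {years_to_break_even:.1f} years")
# -> Break-even after 3.2 years; beyond that point the evaluation is
#    effectively "cost-free" in the sense used on the slide.
```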

Analyzing Resources: Personnel
  Can the evaluator use “free” staff on site?
    Program staff could collect data.
    Secretaries could type and search records.
    Graduate students could contribute internship or course-related work.
    PTA members could volunteer.
  It is key that the evaluator orient, train, and quality-control such volunteers to maintain the evaluation's integrity.
    Supervision and spot-checking are useful practices.
    Task selection is essential to maintaining the study's validity and credibility.

Analyzing Resources: Technology, Other Resources, Constraints
  The more information that must be generated by the evaluator, the costlier the evaluation.
  Are existing data, records, evaluations, and other documents available?
  Newer technology allows less expensive means of data collection:
    Web-based surveys, e-mails, conference calls, posting final reports on websites.
  Time: avoid setting unrealistic timelines.

4. Analyzing the Political Context
  Politics begin with the decision to evaluate and influence the entire evaluation process.
  Who stands to gain or lose most from different evaluation scenarios?
  Who has the power in this setting?
  How is the evaluator expected to relate to different groups?
  From which stakeholders will cooperation be required? Are they willing to cooperate?
  Who has a vested interest in the outcomes?
  Who will need to be informed along the way?
  What safeguards need to be formalized (e.g., IRB approval)?

Variations Caused by the Evaluation Approach Used
  Variations in the evaluation plan will occur based on the approach taken by the evaluator.
  Each approach has strengths and limitations (review Table 9.1 for the characteristics of each).
  Use of a single approach tends to be limiting.

To Proceed or Not?
  Based on information about the context, program, stakeholders, and resources, decide “go/no-go.”
  Chapter 10 lists conditions under which evaluation is inappropriate:
    The evaluation would produce trivial information.
    The evaluation results will not be used.
    The evaluation cannot yield useful, valid information.
    The evaluation is premature for the stage of the program.
    The motives for the evaluation are improper.
  Weigh the ethical considerations (utility, feasibility, propriety, accuracy); a simple checklist sketch follows below.
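As a planning aid, the go/no-go screen can be reduced to a simple checklist. The sketch below encodes the Chapter 10 conditions as boolean flags; the function name and structure are my own illustration, not something the lecture prescribes.

```python
# Hypothetical go/no-go screen based on the inappropriate-evaluation
# conditions listed above. Structure and names are illustrative.

INAPPROPRIATE_CONDITIONS = [
    "would produce trivial information",
    "results will not be used",
    "cannot yield useful, valid information",
    "premature for the stage of the program",
    "motives of the evaluation are improper",
]

def go_no_go(flags: dict) -> bool:
    """Return True (go) only if no inappropriate condition applies."""
    blockers = [c for c in INAPPROPRIATE_CONDITIONS if flags.get(c, False)]
    for blocker in blockers:
        print(f"NO-GO: evaluation {blocker}")
    return not blockers

# Example: the program is too new for a summative evaluation.
proceed = go_no_go({"premature for the stage of the program": True})
print("Proceed?", proceed)
```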