How to Conduct Your First Evaluation
Diana Epstein, Ph.D., CNCS Office of Research and Evaluation
Adrienne DiTommaso, MPA, CNCS Office of Research and Evaluation


Learning objectives
By the end of this presentation, you will be able to:
Describe the basic steps for conducting an evaluation
Plan for an evaluation
Identify the key components of an evaluation plan
Identify approaches for collecting and analyzing data
Understand how to communicate and apply findings for program improvement

Building evidence of effectiveness

Evaluation cycle – Four phases

Basic steps for conducting an evaluation
Planning:
Step 1: Build (or Review) a Program Logic Model
Step 2: Define Purpose and Scope
Step 3: Budget for an Evaluation
Step 4: Select an Evaluator
Step 5: Develop an Evaluation Plan
Implementation:
Step 6: Collect Data
Step 7: Manage the Evaluation
Analysis and Reporting:
Step 8: Analyze Data
Step 9: Communicate Findings
Action and Improvement:
Step 10: Apply Findings and Feedback for Program Improvement

Basic steps by grant cycle year
Across the previous, current, and next grant cycles, the steps fall out by year of the current grant cycle:
Year 1 (Planning): Build (or Review) a Program Logic Model; Define Purpose and Scope; Budget for an Evaluation; Select an Evaluator; Develop an Evaluation Plan
Year 2 (Implementation): Collect Data; Manage the Evaluation
Year 3 (Analysis and Reporting; Action and Improvement): Analyze Data; Communicate Findings; Apply Findings and Feedback for Program Improvement

Year 1: Planning phase
Build a Program Logic Model
Define Purpose and Scope
Budget for an Evaluation
Select an Evaluator
Develop an Evaluation Plan

Step 1: Build a program logic model
A logic model can serve as a framework for your written evaluation plan. It can help you focus your evaluation by identifying:
Questions you want/need answered
Aspects of the program to evaluate
Type of evaluation design
Information to collect
Measures and data collection methods
Evaluation timeframe
For more information on logic models, CNCS grantees can refer to the module “How to Develop a Program Logic Model” on the Evaluation Resources page.

Step 2: Define purpose and scope
Each evaluation should have a primary purpose around which it can be designed and planned.
Why is the evaluation being done?
What do you want to learn?
How will the results be used? By whom?
Additional things to consider:
–Specific program requirements
–Resources available to carry out the evaluation


Group exercise: Develop research questions for a veterans job readiness program
The hypothetical veterans program is designed to address unemployment among veterans and their spouses, as well as their transition into civilian work and community life.
Using the logic model developed for the veterans program, what might be some potential research questions?

Hypothetical AmeriCorps Veterans Program logic model
Process:
Inputs (what we invest): Funding; Staff; 100 AmeriCorps State and National members; 50 non-AmeriCorps volunteers; Research
Activities (what we do): Conduct job readiness workshops; Provide job search assistance; Provide peer counseling services; Provide referrals to transitioning services; Educate potential employers
Outputs (direct products from program activities): # individuals participating in workshops; # individuals receiving job search assistance; # individuals receiving counseling services; # families receiving referrals; # employers receiving education
Outcomes:
Short-term (changes in knowledge, skills, attitudes, opinions): Increased confidence in gaining employment; Increase in job readiness skills; Increased knowledge of effective job search strategies; Increased knowledge of community services; Increased employer knowledge of hiring benefits
Medium-term (changes in behavior or action that result from participants’ new knowledge): Increase in job placement; Increased capacity of families to manage transition from military to civilian work and family life; Increased adoption of military-friendly practices by employers
Long-term (meaningful changes, often in their condition or status in life): Individuals maintain stable employment; Increased family well-being; Employers routinely hire veterans and military spouses

Step 3: Budget for an evaluation
Common cost categories:
Staff time
Materials, equipment, and supplies
Travel
Data collection

Step 3: Budget for an evaluation
Consider questions of:
Who will conduct it?
–If external evaluator, consider what services are and are not included in their cost
–If own staff, consider cost of time spent on evaluation relative to programmatic tasks
What will it include and how will it be conducted?
Will it involve new data collection?
–If so, at what time points and where?
Who will manage it?
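To make these categories concrete, here is a minimal sketch of a rough evaluation budget rolled up by the cost categories listed above; the dollar amounts are hypothetical placeholders and not from the original module.

```python
# Minimal sketch: rough evaluation budget by common cost category.
# All amounts are hypothetical placeholders, not CNCS figures.
evaluation_budget = {
    "Staff time": 15_000,
    "Materials, equipment, and supplies": 2_000,
    "Travel": 3_000,
    "Data collection": 5_000,
}

for category, cost in evaluation_budget.items():
    print(f"{category:<36} ${cost:>8,}")
print(f"{'Total':<36} ${sum(evaluation_budget.values()):>8,}")
```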

Step 4: Select an evaluator
An evaluator is an individual or team of people responsible for leading the evaluation. Potential options for an evaluator include:
–An external source (e.g., consulting firm, college or university personnel, independent consultant)
–An internal source (program staff member(s))

Step 4: Select an evaluator
A key decision is whether to use an internal staff member or to rely on an external evaluator. Factors to consider when making this decision:
–Purpose of the evaluation
–Staff workload and expertise
–Program resources (e.g., financial, necessary computer software, etc.)
–Specific program requirements (e.g., AmeriCorps grantees with grants of $500,000 or more are required to conduct an independent evaluation)

Step 4: Select an evaluator
Evaluator’s independence:
No conflicts of interest related to the evaluation
Able to provide an unbiased assessment of the program’s outcomes/impacts

Step 4: Select an evaluator
How do you find an external evaluator?
Academic settings
–Contact individuals at your local college or university
Professional settings
–American Evaluation Association (AEA) website: click on the “Find an Evaluator” tab
Ask others in your network

Step 4: Select an evaluator
Consider whether your potential evaluator has:
Formal training in evaluation studies
Experience evaluating similar programs/interventions
Experience that matches the design, methods, and/or approach of your planned evaluation
Capacity to handle the scale of your planned evaluation
A personal style that fits your program staff or organization

What is an evaluation plan?
An evaluation plan is a written document that describes how you will evaluate your program:
Explains the program model being evaluated
Provides detailed instructions for the evaluation
Describes and justifies the evaluation approach selected

Why develop an evaluation plan?
Clarifies what direction the evaluation should take based on priorities, resources, time, and skills
Creates shared understanding of the purpose and use of evaluation results
Fosters program transparency to stakeholders and decision makers
Helps identify whether there are sufficient program resources to carry out the evaluation
Facilitates a smoother transition when there is staff turnover

Step 5: Develop an evaluation plan
What should your evaluation plan include?
I. Introduction
II. Program background
III. Research questions
IV. Evaluation design
V. Sampling methods, measurement tools, and data collection procedures
VI. Analysis plan
VII. Approach to reporting results
VIII. Timeline, budget, and evaluator qualifications
See the Frequently Asked Questions: Evaluation document on the Knowledge Network for more details.

Step 5: Develop an evaluation plan
I. Introduction
The introduction establishes the context of your planned evaluation. It should explain:
The problem/issue addressed by the program
Your program’s theory of change
The purpose of the planned evaluation
The general approach for the planned evaluation

Step 5: Develop an evaluation plan
II. Program background
This section should provide detail about your program model. It should include:
Your program’s theory of change
Existing research supporting your program’s theory of change
Your logic model
The outcomes of interest that your evaluation will assess

Step 5: Develop an evaluation plan
III. Key evaluation research question(s)
Your evaluation plan should list each research question that will be investigated. Your research questions should be:
Clearly stated
Measurable
Aligned with your program’s theory of change and logic model

Step 5: Develop an evaluation plan
IV. Evaluation design
Your plan should detail your selected evaluation design and the rationale for choosing it. When selecting a specific design, consider the following:
–Which design will provide the desired information and/or fulfill program requirements?
–How feasible is each option?
–Are there any ethical concerns with a given design?
–What are the costs associated with each design option?

Step 5: Develop an evaluation plan
Two common types of evaluation designs:
Process/Implementation design:
–Examines how well the program matches its theoretical model
–Confirms what the program actually does
Outcome/Impact design:
–Addresses how a program’s activities relate to changes in participants or beneficiaries
–Provides evidence as to whether the program causes observed changes

Step 5: Develop an evaluation plan
V. Sampling methods, measurement tools, and data collection procedures
This section should detail how you will collect or compile data for your evaluation by describing:
What/who the sources of data are
Types of data to be collected/compiled (e.g., surveys, interviews, administrative data)
Sampling methods (if any)
When the data will be collected and by whom
How the data will be analyzed
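To illustrate the sampling bullet above, here is a minimal sketch of one common approach, a simple random sample of participants to receive a survey; the roster size, sample size, and naming are hypothetical and not from the original module.

```python
# Minimal sketch: draw a simple random sample of participants for a survey.
# The roster and sample size are hypothetical.
import random

roster = [f"participant_{i:03d}" for i in range(1, 201)]  # hypothetical roster of 200 participants
random.seed(42)                                           # fixed seed so the draw can be reproduced
survey_sample = random.sample(roster, k=50)               # simple random sample of 50

print(f"Sampled {len(survey_sample)} of {len(roster)} participants")
```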

Year 2: Implementation phase
Collect Data
Manage the Evaluation

Step 6: Collect data
Existing data
–Internal program data
–External datasets or program/administrative data
New data
–Develop data collection instruments (interview protocols and/or questionnaires)
–Conduct interviews
–Field surveys

Step 7: Manage the evaluation
Communicate
Monitor
Support

Step 7: Manage the evaluation
Communicate: Maintain communication throughout the project
–Project kick-off meeting
–Regular, ongoing meetings to keep the evaluation moving in a timely and efficient manner
–Ad hoc meetings to discuss specific topics
Monitor: Continually monitor evaluation progress and the evaluator’s work
–Review and provide feedback on deliverables (e.g., evaluation plan, design, instruments, reports)
–Enforce the schedule for completing tasks and make adjustments as needed
–Assess the evaluator’s skills and performance throughout the evaluation
–Keep up with invoicing/payments

Step 7: Manage the evaluation
Support: Provide support and feedback as needed
–Offer advice and guidance to help troubleshoot issues
–Ensure the evaluator has access to the information required
–Provide continuous input and feedback on the evaluator’s work

Year 3: Analysis and reporting phase
Analyze Data
Communicate Findings

Step 8: Analyze data
Quantitative data
–Statistical analysis (mean, median, chi-square, t-test, ANOVA, regression, etc.)
Qualitative data
–Content analysis (cross-site analysis, theme identification, case study descriptions)
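As an illustration of the quantitative techniques listed above, here is a minimal sketch of a mean comparison and an independent-samples t-test in Python; the scores and group sizes are hypothetical and not from the original module.

```python
# Minimal sketch (hypothetical data): compare mean job-readiness scores for
# program participants and a comparison group with an independent-samples t-test.
from statistics import mean
from scipy import stats

participants = [72, 85, 78, 90, 66, 81, 77, 88]  # hypothetical post-program scores
comparison = [65, 70, 62, 74, 68, 71, 60, 69]    # hypothetical comparison-group scores

print(f"Participant mean: {mean(participants):.1f}")
print(f"Comparison mean:  {mean(comparison):.1f}")

# Welch's t-test, which does not assume equal variances across groups
t_stat, p_value = stats.ttest_ind(participants, comparison, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```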

Step 8: Example data collection and analysis crosswalk
Process or Impact Evaluation of Your Program
For each research question, the crosswalk records:
Indicators or outcome of interest
What is collected and how?
From whom / data sources?
When collected and by whom?
How will you analyze the data?

Step 8: Example data collection and analysis crosswalk
Process Evaluation of a Job Readiness Program for Veterans
Research question: Is the job readiness program being implemented as designed?
Indicators: a) Member use of program curriculum during workshops; b) Duration of workshops; c) Participant workshop attendance rates
What is collected and how? a–c) Members report details about workshops in logs with pre-defined categories of reporting; a–b) Observations of workshops
From whom / data sources? a–c) Members
When collected and by whom? a–b) Evaluator observes participants in workshops; a–c) External evaluator collects the workshop logs quarterly; a) Quarterly observations by the evaluator(s) using structured observation protocols
How will you analyze the data? a–c) Generate frequencies on use of curriculum, average duration of workshops, and average rate of workshop attendance; c) Generate frequencies and averages on quantitative data (e.g., rating scales, frequency scales) and thematically code and analyze open-ended comments/notes
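As a hypothetical illustration of the analysis described in the last row (frequencies and averages from member workshop logs), here is a minimal pandas sketch; the log fields and values are invented for the example.

```python
# Minimal sketch (hypothetical workshop-log data): frequencies and averages
# corresponding to indicators a) curriculum use, b) duration, c) attendance rate.
import pandas as pd

logs = pd.DataFrame({
    "workshop_id": [1, 2, 3, 4],
    "curriculum_used": [True, True, False, True],  # a) member use of program curriculum
    "duration_min": [90, 120, 60, 90],             # b) workshop duration in minutes
    "attendance_rate": [0.80, 0.65, 0.90, 0.75],   # c) participant attendance rate
})

print(logs["curriculum_used"].value_counts())                      # frequency of curriculum use
print("Average duration (min):", logs["duration_min"].mean())      # average workshop duration
print("Average attendance rate:", logs["attendance_rate"].mean())  # average attendance rate
```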


Step 8: Example data collection and analysis crosswalk
Impact Evaluation of a Job Readiness Program for Veterans
Research question: What impact does the job readiness intervention have on veterans’ ability to secure and maintain employment relative to a comparison group?
Outcome of interest: Veterans’ employment status
What is collected and how? Veterans’ employment status is measured with a survey.
From whom / data sources? Veterans participating in the program serve as the intervention group; veterans receiving no job assistance services serve as the comparison group.
When collected and by whom? The evaluator administers the survey at two time points: before the job readiness program begins and 1 year after the job readiness program is implemented.
How will you analyze the data? Calculate the difference in average outcome in the intervention group minus the difference in average outcome in the comparison group before and after treatment (difference-in-differences method).
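The analysis in the last row is the difference-in-differences method. Written as a formula (the notation is added here and is not from the original slide), with \bar{Y} denoting a group's average employment outcome at a time point:

```latex
\mathrm{DiD} =
\left(\bar{Y}^{\,\mathrm{post}}_{\mathrm{intervention}} - \bar{Y}^{\,\mathrm{pre}}_{\mathrm{intervention}}\right)
- \left(\bar{Y}^{\,\mathrm{post}}_{\mathrm{comparison}} - \bar{Y}^{\,\mathrm{pre}}_{\mathrm{comparison}}\right)
```

A positive value suggests the intervention group's employment outcomes improved more than the comparison group's over the same period.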

Step 8: Analyze data
Consider two questions:
What conclusions about the research questions can be drawn from the data that have been analyzed?
What do the data suggest about the program’s theory of change?

Step 9: Communicate findings
Who are the potential target audiences? What information do they need or want?
–Program staff, agency personnel, stakeholders, beneficiaries, funders, etc.
What are potential tools for communicating findings?
–Formal reports, shorter memos, PowerPoint briefings, etc.

Step 9: Communicate findings
What is an evaluation report?
A key product resulting from the evaluation
A written document that objectively describes:
–Program background
–Evaluation purpose, methods, procedures, and limitations
–Evaluation results
–Conclusions and recommendations
–Lessons learned
–Questions for future research

Step 9: Communicate findings
When reporting findings, it is important to:
Report positive, as well as negative, findings
Present results that are not necessarily conclusive, but show promise and warrant further examination
Be careful not to overstate your findings

Step 9: Communicate findings
Other useful products for communication:
Executive summary of the final report (5-10 pages)
Short research briefs (2-4 pages)
–Graphics and pictures
–Bulleted information
Non-technical memos

Action and improvement steps: Reporting and utilizing results
Apply Findings and Feedback for Program Improvement

Step 10: Apply findings and feedback for program improvement
Evaluation findings can support decisions and actions with respect to:
Program design, implementation, and effectiveness
Program improvement
Implementing change


Resources on evaluation
Go to the National Service Knowledge Network evaluation page for more information.
Other courses available:
How to Develop a Program Logic Model
Overview of Evaluation Designs
How to Write an Evaluation Plan
Budgeting for Evaluation
Data Collection for Evaluation
Managing an External Evaluation
And more!

Evaluation resources page

Questions?