Implementing Evaluations: Strategies for Success

1 Implementing Evaluations: Strategies for Success
American Evaluation Association Conference
Skill-Building Workshop #764, Capistrano A
November 4, 2011 (4:30-6:00 pm)
CDC and Battelle: Sheri Disler, Robin Shrestha-Kuwahara, Joanne Abed, Carlyn Orians, Linda Winges, Shyanika Rose

2 Implementing Evaluations: Strategies for Success
Material for this session:
Developed for grantees of the National Asthma Control Program in response to grantee requests.
The self-study guide Learning and Growing Through Evaluation (the Module 2 materials used for this workshop) is available at:
Suitable as a capacity-building workshop for:
New evaluator training
In-house organizational capacity building
Evaluation technical assistance provision
Also useful for reflecting on evaluation practice

3 Session Objectives
By the end of this session, participants will be able to:
Recognize the 5 Critical Areas that should be considered when implementing an evaluation.
Anticipate and troubleshoot the kinds of challenges that may crop up during an evaluation.
Identify the 9 Best Practices that can inform evaluation planning and help ensure that an evaluation runs smoothly and achieves the desired results.

4 PLAN Ahead

5 Exercise 1: 5 Critical Areas of Evaluation Implementation

6 5 Critical Areas of Evaluation Implementation
Evaluation Context: Recognize that evaluations exist within an organizational hierarchy and are embedded within a community.
Evaluation Logistics: Recognize that an evaluation needs to be managed like any other project.
Data Collection/Data Compilation: Recognize that challenges can occur whether collecting new data or compiling existing data.
Data Analysis: Avoid collecting data that cannot be analyzed or that do not meet programmatic and stakeholder information needs.
Dissemination of Evaluation Findings: Avoid producing findings that are not useful, not acceptable, or not believable, or that come too late to meet information needs and inform programmatic decision-making.

7 Worksheet 1: What are some specific challenges in each Critical Area?

8 Challenges in Evaluation Context
Negative community response
Lack of political will
Changes in program priorities
Lack of support from program leadership

9 Challenges in Evaluation Logistics
Difficulty communicating with contractors, partners, and stakeholders
Confusion about roles and responsibilities
Insufficient financial resources to complete the evaluation
Inadequate staffing resources to complete the evaluation
Evaluation goes off track in terms of scope, budget, and/or timeline

10 Challenges in Data Collection/ Data Compilation
Ineffective instruments and procedures
Lack of access to needed data
Difficulties recruiting participants/respondents
Difficulties working with data collection contractors
Difficulties working with data collection partners
Difficulties managing the volume of incoming data
Ethical breaches
Loss or corruption of data

11 Challenges in Data Analysis
Data not useful
Confusion over how to analyze data
Preliminary findings indicate a need for program modifications

12 Challenges in Dissemination of Findings
Late timing of evaluation data with respect to information needs
Findings not used
Findings not welcomed
Findings from the evaluation have implications for the strategic evaluation plan

13 Exercise 2: Taking Charge of Your Evaluation

14 Taking Charge of Your Evaluation
What we’ve learned so far:
While an evaluation can face many challenges, the majority of these challenges fall into one of the 5 Critical Areas, so we need to pay special attention to:
Evaluation context
Evaluation logistics
Data collection
Data analysis
Dissemination
What we’ll learn through this exercise:
An ounce of prevention is worth a pound of cure! We can troubleshoot challenges that occur as we implement an evaluation. Better yet, we can prevent many from happening in the first place.

15 Lack of Interest/Support from Program Leaders
Example for Evaluation Context
What can we do if we encounter it?
Communicate with leaders and keep them informed about the evaluation through regular progress-monitoring reports
Consider alternate dissemination methods that are more concise and accessible than detailed evaluation reports
If leadership changes, communicate with the new leaders and solicit their input
What might we have done to prevent it?
Include frontline program leaders in strategic evaluation planning sessions
Include frontline program leaders in stakeholder discussions about individual evaluations as they are planned and implemented

16 Insufficient Financial Resources to Complete Evaluation
Example for Evaluation Logistics
What can we do if we encounter it?
Manage the evaluation carefully, regularly monitoring the evaluation budget during implementation
Consider reductions in scope if the budget burns too quickly
Document lessons learned through any cost-saving measures used and their effectiveness
Keep track of resources spent to help generate realistic estimates for future evaluations
What might we have done to prevent it?
Have resource estimates developed by individuals experienced in evaluation
Consider efficiencies across evaluations through strategic planning
Allow for “wiggle room” in your budget in case surprises occur

17 Inefficient Instruments / Strategies
Example for Data Collection/Compilation
What can we do if we encounter it?
Train all data collection staff
Regularly monitor data collection activities to detect emerging problems
What might we have done to prevent it?
Consider using or adapting existing instruments that have already been tested
For new instruments, include stakeholders and individuals experienced in evaluation in their design
Pilot test instruments
Use multiple methods to triangulate findings

18 Data Collected Not Useful
Example for Data Analysis
What can we do if we encounter it?
Revise data collection instruments or clarify instructions
Discuss findings and their implications with stakeholders after the evaluation
Work with stakeholders to address evaluation findings in an action plan
What might we have done to prevent it?
Discuss with stakeholders their information needs and priorities, and incorporate these into the evaluation plan
Identify what stakeholders view as credible evidence
Specify how data analyses will help answer the evaluation questions
Pilot test instruments and revise as necessary

19 Findings Not Welcomed by Stakeholders
Example for Dissemination of Findings
What can we do if we encounter it?
Communicate with stakeholders throughout the evaluation to avoid surprises
Discuss findings and their implications with stakeholders, emphasizing constructive action that can be taken
Document strategies to address findings in an action plan
What might we have done to prevent it?
Up front, discuss with stakeholders their information needs and priorities, and incorporate these into the evaluation plan
Up front, discuss with stakeholders how to handle findings that suggest a need for program modification
Consider alternative modes of dissemination that may be more useful and accessible than an evaluation report

20 Worksheet 2: What can we do to address challenges that crop up during the implementation of an evaluation? What might we have done to prevent them from happening in the first place?

21 Exercise 3: Identifying Best Practices

22 Identifying Evaluation Best Practices
What we’ve learned so far:
An evaluation can face many challenges. Most occur in the 5 Critical Areas: context, logistics, data collection, data analysis, and dissemination.
Some challenges can be addressed as they occur. Many more can be prevented from occurring in the first place through careful planning and detailed documentation of planning decisions in the individual evaluation plan.
What we’ll learn through this exercise:
A relatively small number of Best Practices can go a long way toward addressing the majority of potential challenges.

23 Highlighted Words in the 5 Examples from Exercise 2
Manage, keep track
Communicate, progress monitoring
Alternate dissemination methods
Stakeholders, include, discussions
Document lessons learned, individuals experienced in evaluation
Strategic planning
Train
Pilot test
Action plan

24 9 Evaluation Best Practices
Work with stakeholders throughout the evaluation life cycle
Manage the evaluation as you would any other project
Pilot test data collection instruments and procedures
Train data collection staff
Monitor progress and communicate frequently with key team members and stakeholders throughout the evaluation
Disseminate results to stakeholders, including interim reporting and alternate formats
Develop an action plan for implementing recommendations, and include stakeholders in the process
Document lessons learned
Link back to your strategic plan after each new evaluation

25 Questions?

