1 Introduction to Evaluating the Minnesota Demonstration Program
Paint Product Stewardship Initiative
September 19, 2007, Seattle, WA
Matt Keene, Evaluation Support Division
National Center for Environmental Innovation
Office of Policy, Economics and Innovation
U.S. Environmental Protection Agency
2 Presentation Objective
Introduce the Paint Product Stewardship Initiative to the key steps in designing the demonstration program evaluation.
3 Session Agenda
Program Evaluation: Definition, Uses, Types
- What is Program Evaluation?
- Why Should We Evaluate?
Steps in the Evaluation Process
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan
4 What is Program Evaluation?
Program Evaluation: A systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why.
Performance Measurement: The ongoing monitoring and reporting of program progress and accomplishments, using pre-selected performance measures.
5 Why Evaluate?
Good program management:
- Ensure program goals and objectives are being met.
- Help prioritize resources by identifying the program services yielding the greatest environmental benefit.
- Learn what works well, what does not, and why.
- Learn how the program could be improved.
Provide information for accountability purposes:
- Government Performance and Results Act of 1993: Requires EPA to report schedules for and summaries of evaluations that have been or will be conducted, and to identify those that influence development of the Agency's Strategic Plan.
- Environmental Results Order 5700.7: Requires EPA grant officers and grant recipients to identify outputs and outcomes from grants and connect them to EPA's Strategic Plan.
6 Steps for Designing an Evaluation
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan
7 Assessing Whether to Evaluate Your Program (Evaluability Assessment)
1. Is the program significant enough to merit evaluation? Consider: program size, number of people served, transferability of the pilot, whether it is undergoing PART.
2. Is there sufficient consensus among stakeholders on the program's goals and objectives?
3. Are staff and managers willing to make decisions about or change the program based on evaluation results?
4. Are there sufficient resources (time, money) to conduct an evaluation?
5. Is relevant information on program performance available, or can it be obtained?
6. Is an evaluation likely to provide dependable information?
7. Is there a legal requirement to evaluate?
(Adapted from Worthen et al. 1997.)
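For teams that want to apply this checklist systematically, the sketch below shows one possible way to tally yes/no answers to the seven questions. The question wording comes from the slide; the function name and the simple "criteria met" summary are illustrative assumptions, not part of the EPA or Worthen et al. material.

```python
# Illustrative sketch only: question wording from the slide above; the
# scoring rule and function name are assumptions for demonstration.

EVALUABILITY_QUESTIONS = [
    "Is the program significant enough to merit evaluation?",
    "Is there sufficient consensus among stakeholders on goals and objectives?",
    "Are staff and managers willing to act on evaluation results?",
    "Are there sufficient resources (time, money) to conduct an evaluation?",
    "Is relevant information on program performance available or obtainable?",
    "Is an evaluation likely to provide dependable information?",
    "Is there a legal requirement to evaluate?",
]

def evaluability_summary(answers: list[bool]) -> str:
    """Summarize yes/no answers to the seven questions (hypothetical helper)."""
    yes = sum(answers)
    return f"{yes}/{len(answers)} criteria met"

# Example: six of the seven questions answered "yes"
print(evaluability_summary([True, True, True, True, True, True, False]))
```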
8 Steps for Designing an Evaluation
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan
9 Identify Evaluation Team Members
Select diverse team members:
- Individuals responsible for designing, collecting, and reporting information used in the evaluation
- Individuals with knowledge of the program
- Individuals with a vested interest in the conduct or impact of the program
- Individuals with knowledge of evaluation
Identify a skeptic!
10 Steps for Designing an Evaluation
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan
11 Describe the Program
Describe the program using a logic model.
Use the logic model to:
- Check assumptions about how the program is supposed to work
- Brainstorm evaluation questions
13 Elements of the Logic Model
Resources/Inputs: Programmatic investments available to support the program.
Activities: Things you do; the activities you plan to conduct in your program.
Outputs: Product or service delivery/implementation targets you aim to produce.
Customers: Users of the products/services; the target audience the program is designed to reach.
Outcomes:
- Short-term: Changes in learning, knowledge, attitudes, skills, understanding.
- Intermediate: Changes in behavior, practice, or decisions.
- Long-term: Changes in condition.
External Influences: Factors outside of your control (positive or negative) that may influence the outcome and impact of your program/project.
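To make the relationships among these elements concrete, here is a minimal sketch that represents a logic model as a plain data structure. The class and field names are assumptions made for illustration, not an EPA-specified format; the example values are drawn from the PPSI logic model on the next slide.

```python
# A minimal sketch of the logic model elements as a plain data structure.
# Class and field names are illustrative assumptions, not an EPA format.
from dataclasses import dataclass, field

@dataclass
class Outcomes:
    short_term: list[str] = field(default_factory=list)    # learning, knowledge, attitudes, skills
    intermediate: list[str] = field(default_factory=list)  # behavior, practice, decisions
    long_term: list[str] = field(default_factory=list)     # change in condition

@dataclass
class LogicModel:
    resources: list[str] = field(default_factory=list)            # programmatic investments
    activities: list[str] = field(default_factory=list)           # things the program does
    outputs: list[str] = field(default_factory=list)              # products/services delivered
    customers: list[str] = field(default_factory=list)            # target audience reached
    outcomes: Outcomes = field(default_factory=Outcomes)          # short/intermediate/long-term
    external_influences: list[str] = field(default_factory=list)  # factors outside your control

# Example values drawn from the PPSI demonstration program logic model
model = LogicModel(
    activities=["Education/outreach and social marketing campaign", "Collect baseline data"],
    outputs=["Workshops", "Education materials", "Program database"],
    customers=["Consumers", "Retailers", "Manufacturers"],
)
model.outcomes.short_term.append("Awareness of recycled paint and the waste hierarchy improves")
```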
14 PPSI Demonstration Program Logic Model
Program Goal: Design, implement, and evaluate a fully funded statewide paint product stewardship program that is cost-effective and environmentally beneficial.
Program stages: Planning and Needs Assessment Stage; Implementation Stage; Use and Transfer Stage.
Activities:
- Outreach/Education: Establish relationships/partnerships; implement education/outreach and social marketing projects/campaign.
- Measurement: Collect baseline data; ongoing data collection; interim analysis.
Outputs: Baseline information; program database; education materials; workshops; media; tools for consumers and retailers; interim reports and presentations.
Customers: Consumers, retailers, manufacturers, agencies, environmental groups, recyclers.
Outcomes:
- Shorter-term (awareness): Awareness of recycled paint and the waste hierarchy improves.
- Intermediate (behavior): Decisions based on the waste hierarchy.
- Longer-term (condition): Less waste paint.
Definitions: Management systems = collection, processing, transportation, recycling, disposal. Waste hierarchy = reduce, reuse, recycle, resource recovery.
(Logic model dated September 13, 2007.)
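The measurement activities above (baseline data, ongoing collection, interim analysis) imply a simple before-and-after comparison. The sketch below illustrates that idea with a hypothetical awareness metric; the function name and the survey responses are invented for illustration and are not program data.

```python
# Illustrative sketch of comparing baseline and interim survey results.
# All values and names here are hypothetical, not PPSI data.

def awareness_rate(responses: list[bool]) -> float:
    """Share of survey respondents aware of recycled paint."""
    return sum(responses) / len(responses) if responses else 0.0

baseline = [True, False, False, True, False]  # hypothetical baseline survey
interim = [True, True, False, True, True]     # hypothetical interim survey

change = awareness_rate(interim) - awareness_rate(baseline)
print(f"Change in awareness: {change:+.0%}")
```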
15 Steps for Designing an Evaluation
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan
16 What are Evaluation Questions?
Questions (at any point on the performance spectrum/logic model) that the evaluation is designed to answer. They should reflect stakeholders' needs.
Evaluation questions are key because they:
- Frame the scope of the evaluation
- Drive the evaluation design, data collection, and reporting
17 Types of Evaluations and Common Evaluation Questions
- Design assessment: Is the design of the program well formulated, feasible, and likely to achieve the intended goals?
- Process evaluation or implementation assessment: Is the program being delivered as intended to the targeted recipients? Is the program well managed?
- Outcome evaluation: Are desired program outcomes obtained? Did the program produce unintended outcomes?
- Net impact evaluation: Did the program cause the desired impact? Is one approach more effective than another in obtaining the desired outcomes?
- Cost evaluation: What are the specific costs for implementing and operating the program? Is the program cost-efficient? Cost-effective?
(Adapted from Evaluation Dialogue Between OMB and Federal Evaluation Leaders: Digging a Bit Deeper into Evaluation Science, April 2005.)
18 The Evaluation Plan
What: A brief document describing the evaluation's purpose, audience, scope, design, and methods.
Why: The purpose is to clearly articulate and communicate expectations for the evaluation.
Who: Developed by one or more team members based on the team's common understanding.
When: Can be developed at any point from initial selection of the program through development of the research design. Plans are living documents and need to be revised to account for changes in evaluation objectives or methods.
19 Components of an Evaluation Plan
- Purpose of the evaluation / evaluation questions
- Primary audience
- Context (organizational, management, political)
- Data collection methods and analysis
- Evaluation design
- How evaluation findings will be reported (consider different formats for different target audiences)
- Expectations for roles and communication among evaluators, program staff/managers, and key stakeholders
- Resources available for the evaluation (staff, budget)
- Timeline for the evaluation
Note: Save sufficient time to develop evaluation questions and analyze data thoroughly.
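As one way to keep these components in view while drafting, the sketch below lays out the plan outline as a simple dictionary template. The keys mirror the components listed above; the structure and placeholder values are assumptions for illustration, not a required format.

```python
# Illustrative evaluation plan template; keys mirror the slide's components,
# placeholder values are hypothetical and would be filled in by the team.
evaluation_plan = {
    "purpose_and_questions": ["How well is the demonstration program working, and why?"],
    "primary_audience": "PPSI stakeholders",
    "context": ["organizational", "management", "political"],
    "data_collection_and_analysis": [],
    "evaluation_design": "",
    "reporting": "Formats tailored to each target audience",
    "roles_and_communication": {"evaluators": [], "program_staff": [], "stakeholders": []},
    "resources": {"staff": None, "budget": None},
    "timeline": {},
}
```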
20 Steps for Designing an Evaluation
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan
21 Contact
Matt Keene
(202) 566-2240
Keene.matt@epa.gov
Evaluation Support Division
National Center for Environmental Innovation
Office of Policy, Economics and Innovation
U.S. Environmental Protection Agency