1. Integrating Evaluation into the Design of the Minnesota Demonstration Project
Paint Product Stewardship Initiative
St. Paul, MN — May 1, 2008
Matt Keene, Evaluation Support Division
National Center for Environmental Innovation
Office of Policy, Economics and Innovation
US Environmental Protection Agency
2. Purpose of the Presentation
Provide the PPSI with an understanding of the work of the evaluation committee and the process of integrating evaluation into the design of the Minnesota Demonstration Project.
3. Presentation Outline
1. Introduction to Program Evaluation
2. The Evaluation Committee & Goal 6
3. Integrating Evaluation into MN
4. Questions, Comments, and Next Steps
4. Program Evaluation
Definition: A systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why.
Orientations/approaches to evaluation:
- Accountability (external audience)
- Learning & program improvement (internal/external audiences)
5. Measurement and Evaluation
Program evaluation: A systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why.
Performance measurement: A basic and necessary component of program evaluation, consisting of the ongoing monitoring and reporting of program progress and accomplishments using pre-selected performance measures.
6. The Evaluation Committee & Goal 6
- Evaluation Team (Committee)
- Purpose and function of the Evaluation Team
- Funding and support
- The Work Plan and Goal 6
7. The Evaluation Committee & Goal 6
What will we evaluate? Paint, management systems, education, markets.
Why are we evaluating the program? Leadership, learning, transfer.
Can we evaluate this project?
- We must integrate evaluation into the project.
- We need a framework to follow, and we are building it as we go.
- Initially, integrating evaluation into a program is a design and planning activity.
8. Integrating Evaluation into the Minnesota Project: Integrating Evaluation into Program Design
Program: 1. Team; 2. Mission; 3. Goals & Objectives; 4. Logic Model
Questions: 1. Context; 2. Audience; 3. Communication; 4. Use
Measures: 1. Data Sources; 2. Collection Methods & Strategy; 3. Analysis Tools; 4. Data Collection; 5. Data Management
Documentation: 1. Evaluation Policy; 2. Evaluation Methodology
9. Select and Describe the Program
What is our program? Describing the MN program:
- Mission
- Goals and objectives
- Logic model
10. Evaluation Questions
What are the critical questions for understanding the success of the MN program?
11. Examples of Draft Questions
- Has this been a collaborative and cooperative process?
- How successful is the PSO?
- How effective are education and outreach materials?
- How effective are the paint management systems?
- What are the best options for a national system?
12. Evaluation Questions
- What contextual factors may influence the answers to each question?
- Who are the audiences for each question?
- What is the best way to communicate with each audience?
- How might each audience use the answer to each question?
13. Evaluation Questions
- What are the critical questions for understanding the success of the MN program?
- What contextual factors may influence the answers to each question?
- Who are the audiences for each question?
- What is the best way to communicate with each audience?
- How might each audience use the answer to each question?
14. Performance Measures
- What can we measure to answer each question?
- Where can we find the information for each measure?
- How can we collect the information?
- Given our questions and the information to be collected, what will be an effective collection strategy?
15. Performance Measures
- What analytical tools will give us the most useful information?
- How will we implement the collection strategy?
- How will we manage the data?
16. Performance Measures
- What can we measure to answer each question?
- Where can we find the information for each measure?
- What methods are best suited for each measure?
- What analytical tools will give us the most useful information?
- Given our questions and the information to be collected, what will be our collection strategy?
- How will we implement the collection strategy?
- How will we manage the data?
17. Documentation: Methodology & Policy
Evaluation methodology: The process of integrating evaluation generates a framework for our methodology.
Evaluation policy: Guides MN & PPSI; guides strategy and planning for evaluation and program management.
18. Check the Logic and Flow
- Revisit the process and the decisions made.
- Look for the flow in the process and identify potential breaks.
- Identify potential obstacles to our approach to understanding and managing the performance of the MN demonstration program.
- The 1st cycle is integration; the next cycle begins implementation.
19. Questions, Comments and Clarifications
1. Questions for the Demonstration Committee
2. Questions for the Evaluation Committee
Recap:
1. Introduction to Program Evaluation
2. The Evaluation Committee & Goal 6
3. Integrating Evaluation into MN
20. Thank You!
Evaluation Support Division
National Center for Environmental Innovation
Office of Policy, Economics and Innovation
U.S. Environmental Protection Agency
Matt Keene
(202) 566-2240
Keene.Matt@epa.gov
www.epa.gov/evaluate