Presentation transcript:

Warren Leon
Clean Energy States Alliance

• You will be connected to audio using your computer’s microphone and speakers (VoIP) or through your headset.
• If you are calling into the webinar, please select “Use Telephone” on the webinar’s control panel after joining the webinar. Make sure to enter your Audio PIN, shown in the control panel where you choose the option to join by telephone.
• Dial: 1 (805) Access code: Webinar ID: Audio PIN: Shown after joining the meeting
• You will automatically be muted when you join the webinar.
• We’ll take questions after all the presentations have been made; you can queue up a question anytime for the organizer to ask at the end of the session by typing it in the Question box of the control panel. We can also unmute you to ask your question if you raise your “hand.”
• This webinar is being recorded and will be made available after the call on the CESA website, in the Members section.

• No monthly call in August
• MOUs have been mailed. Please contact Maria or Anne with any questions.
• We will be sending out a preliminary agenda and pre-registration for the October members meeting in Washington, DC, shortly. Please pre-register ASAP!

• Explain what is in the report
• Make it easier for you to use the report
• Cover some of the recommendations in the report
• Answer your questions and/or discuss whatever you are interested in

• Considers evaluation from the perspective of the program manager
• Seeks to help you identify evaluation activities that will be useful, cost-effective, and well-received by program staff, policymakers, and stakeholders
• Considers how to choose among and approach different types of evaluations
• Serves as a reference guide: designed to make it easy to find the sections relevant to you

• How to prepare for effective evaluation
  ◦ Setting program goals
  ◦ Producing a program theory
  ◦ Setting evaluation goals
  ◦ Choosing an evaluator
• Discussion of five types of evaluation, with recommendations for how to approach each
• Framing and presenting evaluations for external audiences

• Developing a program theory
  ◦ From the California Evaluation Framework
• Sample program logic model
  ◦ From NYSERDA
• Possible evaluation questions
  ◦ From the EERE Guide to Managing General Evaluation Studies
• Descriptions of models used in cost-benefit evaluations: IMPLAN, JEDI, REMI
• Reference works
• Representative evaluation reports

My Starting Point
1. Improve the quality and efficiency of programs
2. Put a program into context by helping managers and stakeholders understand what it is accomplishing and how it compares to other programs
3. Demonstrate that the agency takes its responsibilities seriously and seeks to maximize the public benefits of public spending

• Evaluation doesn’t receive enough attention
• Few widely accepted protocols for evaluating renewable energy programs
• Difficult to evaluate the results of some programs
• Program goals may not be explicit or fully thought out
• The decision about what to evaluate is not made by program managers

“Not everything that counts can be counted, and not everything that can be counted counts.” (Albert Einstein)

1. Needs and Market Assessments
2. Process Evaluations
3. Outcome Evaluations
  ◦ Some evaluators use “gross outcomes”
4. Impact Evaluations
  ◦ Some evaluators use “net outcomes”
5. Cost-Benefit Evaluations

• What It Does
  ◦ Identifies target markets
  ◦ Identifies barriers to the adoption of renewable energy
  ◦ Develops an understanding of a market or audience
• Why It Is Used
  ◦ Help program managers design programs
  ◦ Establish baselines for measuring future progress
• Key Comments
  ◦ They can lead to better programs
  ◦ They can create baselines for future evaluations of program outcomes and impacts
  ◦ Clean energy agencies should do them more frequently

• What It Does
  ◦ Examines program implementation processes and operations
  ◦ Determines whether the program is well-designed, efficiently managed, and effectively marketed
  ◦ Assesses customer satisfaction
• Why It Is Used
  ◦ Identify ways to improve the program
  ◦ Understand the views of customers and other stakeholders

• They come in many shapes and sizes
• Renewable energy agencies should do them more frequently, especially small, focused evaluation reports
• Customer satisfaction surveys are especially valuable

• What It Does
  ◦ Determines whether the program is achieving its intended outcomes and objectives
• Why It Is Used
  ◦ Keep program managers and others focused on the program’s goals
  ◦ Know whether a program is achieving its objectives
  ◦ Determine whether the program should be modified so that it better achieves its objectives

• It is important to know if a program is achieving its goals
• Depending upon the program and its intended outcomes, an outcome evaluation can be relatively straightforward or very complicated
• Consider up front whether the dissemination of research findings can help key stakeholders and will move a program towards its goals

1. Energy outcomes
2. Environmental outcomes
3. Electricity system outcomes
4. Economic outcomes
• EPA report: Assessing the Multiple Benefits of Clean Energy: A Resource for States
• For renewable energy installation programs, it is easier to quantify their energy and environmental outcomes than their electric system and economic outcomes (see the sketch below)
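To see why energy and environmental outcomes are the tractable ones, note that an energy outcome (MWh generated) converts to an avoided-emissions estimate with little more than a grid emissions factor. Here is a minimal sketch; both figures are hypothetical placeholders, and a real evaluation would use metered program data and a region-specific factor (for example, from EPA's eGRID):

```python
# Minimal sketch: converting an energy outcome into an avoided-emissions
# estimate. Both numbers are hypothetical placeholders, not program data.

annual_generation_mwh = 12_000   # hypothetical program generation
emissions_factor = 0.45          # hypothetical metric tons CO2 per MWh displaced

avoided_co2_tons = annual_generation_mwh * emissions_factor
print(f"Avoided CO2: {avoided_co2_tons:,.0f} metric tons per year")
```

Electric system and economic outcomes, by contrast, require modeling tools such as IMPLAN, JEDI, or REMI (described in the report's appendices), which is part of why they are harder to quantify.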

• Market transformation programs
• Business development programs
• Research, development, and demonstration programs
• Education and information programs

• What It Does
  ◦ Determines the share of the outcomes caused by the program rather than by other factors
  ◦ Identifies unintended but valuable benefits of the program
• Why It Is Used
  ◦ Understand what the program is actually causing to happen
  ◦ Determine whether the program is unnecessarily providing funding to free riders who do not need the program to act

• The more complicated the program theory and the more multi-faceted the program’s route to achieving outcomes, the harder it is to determine which outcomes would have occurred without the program (see the gross-to-net sketch below)
  ◦ Consider how difficult it will be to determine your program’s impacts and how precise an answer you need
• It can be helpful to gather the results from evaluations of similar programs in other states
  ◦ If the evaluation yields results significantly different from those of other states, have the evaluator provide an explanation
• Choose experienced evaluators
• Piggyback other questions onto a survey
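One common convention among energy program evaluators, offered here as an illustrative assumption rather than something spelled out in the webinar, is to convert a gross outcome into a net impact with a net-to-gross ratio that subtracts free ridership and adds spillover:

```python
# Hedged sketch of a gross-to-net adjustment. The rates below are
# illustrative assumptions; real values come from participant surveys
# and evaluator judgment.

gross_outcome_mwh = 10_000   # outcome measured among program participants
free_ridership = 0.20        # share who would have acted without the program
spillover = 0.05             # program-induced actions by non-participants

net_to_gross = 1.0 - free_ridership + spillover
net_impact_mwh = gross_outcome_mwh * net_to_gross
print(f"Net-to-gross ratio: {net_to_gross:.2f}")
print(f"Net program impact: {net_impact_mwh:,.0f} MWh")
```

The harder attribution is to defend, the wider the plausible range for those two rates, which is the report’s point about deciding up front how precise an answer you need.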

• What It Does
  ◦ Compares the economic and/or other benefits of a program’s impacts to the cost of achieving those impacts
• Why It Is Used
  ◦ Determine the extent to which the program’s benefits outweigh its costs
  ◦ Understand whether the program is cost-effective
  ◦ Decide whether the program should be continued as is, modified, or ended

• Proceed cautiously
• These evaluations usually involve predictions about the future, introducing considerable uncertainty
• Consider indirect costs and benefits
• Choose experienced evaluators and understand their methodology
• Make sure the presentation of results reveals rather than obscures the assumptions and uncertainties
• Consider low-cost alternatives:
  ◦ Commentary by an economist
  ◦ Comparisons between states

1. Think carefully about why you want to undertake the evaluation
2. Choose an experienced evaluator (including ones with pre-existing data and models)
3. Make sure you understand the research methods
4. Have the evaluator justify key assumptions
5. Assess more than one scenario
6. Understand the discount rate (see the sketch after this list)
7. Ask the evaluator to examine comparable studies and justify differences
8. Understand exactly what the results show
9. Make the results comparative
10. Don’t imply the results are more accurate than they are
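Recommendations 5 and 6 can be made concrete with a small calculation. All figures below are hypothetical; the point is that the same benefit stream can clear or miss the break-even line depending on the discount rate chosen:

```python
# Illustrative sketch of discount-rate sensitivity in a cost-benefit
# result. All figures are hypothetical assumptions.

upfront_cost = 1_000_000   # hypothetical program cost (year 0)
annual_benefit = 100_000   # hypothetical benefit in each of years 1-15
years = 15

for rate in (0.03, 0.07):  # two discount-rate scenarios
    pv_benefits = sum(annual_benefit / (1 + rate) ** t
                      for t in range(1, years + 1))
    bcr = pv_benefits / upfront_cost
    print(f"Discount rate {rate:.0%}: benefit-cost ratio = {bcr:.2f}")
```

At 3% the benefit-cost ratio comes out to roughly 1.19; at 7% it falls to roughly 0.91. A single-scenario report would hide that sensitivity, which is why the presentation of results should keep the discount-rate assumption in plain view.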

1. Identify all the audiences for the study
2. Make sure the evaluators understand the audiences
3. Develop an outreach plan
4. Decide whether the evaluators should produce collateral material
5. Don’t present the results with inappropriate precision
6. Decide on responses to the report and what to share
7. Disseminate relevant information to stakeholders

Warren Leon
Senior Advisor
Clean Energy States Alliance