USDOE SEP/ARRA Evaluation Webinar: Guidelines for States Conducting or Contracting Evaluations of ARRA-Funded SEP Activities (June 16, 2010 and June 17, 2010)

USDOE SEP/ARRA Evaluation Webinar: Guidelines for States Conducting or Contracting Evaluations of ARRA-Funded SEP Activities. June 16, 2010 and June 17, 2010. Nick Hall, TecMarket Works, on behalf of USDOE – EERE – ORNL

Webinar Procedures Note: you can use telephone or web audio, as noted on your dashboard. Type in "chat" questions as you have them. Questions that are important to a slide or concept will be addressed as we go; most questions will be addressed at the end of the webinar. If you are having problems with your webinar, call Wayne at (888)

Purpose of Webinar Provide guidance to states on how to conduct or contract their own evaluations of ARRA-funded SEP programmatic activities. This session will: 1. Review the EERE SEP State Evaluation Guidelines and discuss how to apply the evaluation standards presented within those Guidelines. 2. Briefly describe the upcoming National SEP-ARRA Evaluation and explain how it relates to and interacts with state-conducted evaluations. 3. Address questions states have on conducting their own evaluations.

Requirement for Evaluation 1. States are strongly encouraged, although not required, to conduct evaluations of their SEP/ARRA programs. 2. It is recommended that state evaluations follow the SEP Evaluation Guidelines.

SEP ARRA Evaluation Guidelines The Guidelines consist of two types of standards; most evaluation professionals are already familiar with these or similar standards. These include: 1. Administrative and Management Standards: cover the management, operations, and structure of the evaluation effort. 2. Technical Evaluation Standards: cover technical approaches and issues within the evaluation effort, and include: A. General design and objectivity standards, B. Study design and application standards.

1. Administrative and Management Standards Standard 1: Evaluation Metrics State-implemented evaluations should focus on quantifying 4 key metrics: 1. Energy and demand savings, 2. Renewable energy capacity installed and generation achieved, 3. Carbon emission reductions (in metric tons), 4. Job creation (number, type, duration). States can include other metrics but should reliably quantify the initiative’s effects within these 4 key metrics.
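For illustration only, the four key metrics could be tracked in a simple structured record such as the Python sketch below; the field names, units, and values are assumptions for this example, not a DOE-prescribed reporting schema.

```python
# Minimal sketch of a record for the four key SEP/ARRA metrics.
# Field names, units, and values are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class KeyMetrics:
    energy_savings_source_mmbtu_per_year: float  # energy savings (source MMBtu/yr)
    demand_savings_kw: float                     # demand savings (kW)
    renewable_capacity_kw: float                 # renewable capacity installed (kW)
    renewable_generation_kwh_per_year: float     # generation achieved (kWh/yr)
    carbon_reduction_metric_tons: float          # emission reductions (metric tons CO2)
    jobs_created: int                            # number of jobs (type/duration tracked separately)

example = KeyMetrics(12_000, 350.0, 500.0, 650_000, 900.0, 14)
```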

1. Administrative and Management Standards Standard 2: Independent Evaluations Evaluations should be conducted by independent evaluation professionals who: have no financial or management interests in the initiatives being evaluated, and do not benefit, or appear to benefit, from the study’s findings. State managers and administrators should have no influence on the findings of the study (similar to an external review or audit). Findings are those of the evaluation professionals conducting the study.

1. Administrative and Management Standards Standard 3: Attribution of Effects Effects are net effects, above and beyond what would have been achieved without the SEP/ARRA funds. Effects reported should be those resulting from the SEP ARRA efforts and not include the effects of other non-SEP/ARRA funded initiatives or activities. When effects are the result of jointly funded initiatives, effects can be allocated to SEP/ARRA in proportion to the percentage of those funds in relation to total initiative or project funding. Other, more complex, methods can also be used to allocate effects.
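As a purely hypothetical illustration of the simple proportional allocation described above (all figures are assumed):

```python
# Hypothetical example: allocating net effects of a jointly funded project
# to SEP/ARRA in proportion to its share of total project funding.
sep_arra_funds = 400_000            # SEP/ARRA contribution ($)
total_project_funds = 1_000_000     # total project funding ($)
total_net_savings_mmbtu = 25_000    # evaluated net annual savings, whole project

sep_share = sep_arra_funds / total_project_funds              # 0.40
sep_attributed_savings = total_net_savings_mmbtu * sep_share  # 10,000 MMBtu/year
print(f"SEP/ARRA-attributed savings: {sep_attributed_savings:,.0f} MMBtu/year")
```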

1. Administrative and Management Standards Standard 4: Evaluation Budgeting Typical evaluation budgets require the allocation of between 2% and 5% of the program/project/activity budget. EERE recommends state evaluation budgets be set at 5% or less of their SEP/ARRA funding levels. SEP/ARRA evaluation funds should focus on SEP/ARRA programs, projects, and activities, and should not be used to evaluate non-SEP/ARRA initiatives.
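For example (illustrative figures only): a state with $40 million in SEP/ARRA program funds would typically plan an evaluation budget in the range of $800,000 to $2 million (2% to 5%), and no more than $2 million under the 5% recommendation.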

1. Administrative and Management Standards Standard 5: Timing of the Evaluations Evaluation planning should begin at the same time as the project activities are initiated, if possible. Baseline evaluation approaches, data collection, and analysis efforts should be established early in the planning process. Evaluations should provide results to program managers as early as possible, while still providing necessary rigor and reliability. To the extent possible, initial or early study results should be reported within 12 months of the start of the evaluation.

2. Technical Standards General Design and Objectivity Standard 1: Study Design Development of the evaluation approach should be independent of the project administrators and implementers. The independent evaluator should work with administrators and implementers to understand operational processes and establish reliable, cost-conscious approaches.

2. Technical Standards General Design and Objectivity Standard 2: Study Rigor and Reliability The approach should be as rigorous as possible, and the results should be as reliable as possible within the study approach and budget limits. Sample protocols for reliable studies can be found in: USDOE Model Energy Efficiency Program Impact Evaluation Guide, November 2007; USDOE Impact Evaluation Framework for Technology Deployment Programs, July 2007; California Evaluation Protocols, April 2006. See the Evaluation Guidelines for additional protocol references.

2. Technical Standards General Design and Objectivity Standard 3: Threats to Validity The independent evaluator should address and report the various threats to validity for the: a) Study design, b) Analysis approach. Study design should minimize threats to validity and reduce the levels of uncertainty. Evaluation plan and final report should discuss threats to validity and describe how they were minimized in the evaluation methods and analysis approach.

2. Technical Standards General Design and Objectivity Standard 4: Alternative Hypotheses Study design should be developed to address, or rule out, alternative hypotheses regarding how observed effects may have occurred.

2. Technical Standards General Design and Objectivity Standard 5: Ability to Replicate The description of the methodological approach should be reported in enough detail that the study can be replicated by another equally qualified evaluation professional and the reliability of the results can be understood.

2. Technical Standards General Design and Objectivity Standard 6: State of the Art Analysis The evaluation approach should use current state-of-the-art evaluation approaches and analysis methods. The approach should maximize the use of technical evaluation advancements and the most current analytical approaches.

2. Technical Standards General Design and Objectivity Standard 7: Unbiased Assessments Objective and unbiased approaches should be used for evaluation design, data collection, analysis and reporting of results. Unsubstantiated claims and unsupported conclusions or personal points of view should be excluded.

2. Technical Standards General Design and Objectivity Standard 8: Attribution of Effects This standard supports the management and administrative standard discussed above; that is, the technical standards also focus on measuring and reporting the net effects of the SEP/ARRA funding. The technical approach should be designed to assess and report SEP/ARRA effects rather than the effects of other market intervention strategies or other non-SEP/ARRA events or conditions.

2. Technical Standards General Design and Objectivity Standard 9: Conflict of Interest Evaluators must disclose any real or perceived conflicts of interest that they might have.

2. Technical Standards Study Design and Application Standards Standard 1: Evaluation Expertise The states should employ, and the evaluations should be led by, skilled, experienced, trained evaluation professionals. Evaluation professionals should be experts within the area of research associated with the study being conducted.

2. Technical Standards Study Design and Application Standards Standard 2: Study Plan Each study should have a detailed study plan specifying: study methods and approach, tasks to be conducted, detailed data collection approach, and detailed analysis approach for each key metric: a. energy and demand saved (source BTUs), b. renewable energy generated, c. carbon emission reductions (metric tons), d. jobs created.

2. Technical Standards Study Design and Application Standards Standard 3: Study Report The Executive Summary should contain: a. net energy impacts for each year over the effective useful life (EUL) of the actions taken; b. renewable energy capacity installed, and annual energy generated and projected to be generated for each year over the EUL of the installed capacity; c. net metric tons of carbon not released into the atmosphere over the EUL of the projects implemented; d. the number and type of short-term and long-term, full-time and part-time jobs generated; e. the results of the SEP-Recovery Act cost effectiveness test.
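As an illustrative example (assumed figures): a measure that saves 500 million source BTUs per year and has a 15-year effective useful life would be reported as roughly 7.5 billion source BTUs of net lifetime savings in the Executive Summary.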

2. Technical Standards Study Design and Application Standards Standard 3: Study Report (continued) The report should include the results and confidence limits for the following metrics: a. energy and demand saved (source BTUs), b. renewable energy generated, c. carbon emission reductions (metric tons), d. jobs created.

2. Technical Standards Study Design and Application Standards Standard 4: Sampling Sampling approaches should use procedures that minimize sampling bias and maximize representativeness of the impacted population. For example: a) Simple random, b) Stratified random, c) Probability proportional to size. Sampling should be no less rigorous than a 90% confidence level with +/- 10% precision for the key attributes on which the sample is selected.
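A minimal sketch of how a required sample size might be estimated for this target, assuming simple random sampling and an assumed coefficient of variation of 0.5; the function name and default values are illustrative, not part of the Guidelines:

```python
# Sketch: sample size for roughly 90% confidence / +/-10% relative precision
# around a mean, with a finite population correction. cv and z are assumptions.
import math

def sample_size_90_10(population: int, cv: float = 0.5,
                      z: float = 1.645, rel_precision: float = 0.10) -> int:
    n0 = (z * cv / rel_precision) ** 2            # infinite-population estimate
    return math.ceil(n0 / (1 + n0 / population))  # finite population correction

print(sample_size_90_10(population=300))  # -> 56 projects out of a 300-project population
```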

2. Technical Standards Study Design and Application Standards Standard 5: IPMVP Field Efforts To the extent possible within the evaluation budget and analytical approach, baseline and post-installation operations assessments should use one of the 4 primary IPMVP field data collection definitional frameworks needed to assess effects. IPMVP requires field measurements for a sample of key performance indicators that influence effects estimates.

2. Technical Standards Study Design and Application Standards Standard 5: IPMVP Field Efforts (continued) Option Definitions A-D: A. Partially Measured Retrofit Isolation B. Retrofit Isolation C. Whole Building D. Calibrated Simulation
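A minimal sketch of an IPMVP Option A style (key parameter measurement) savings calculation for a hypothetical lighting retrofit; the measured loads, stipulated operating hours, and site-to-source conversion factor are all assumptions for illustration:

```python
# Hypothetical Option A example: connected lighting load is measured before and
# after the retrofit; annual operating hours are stipulated rather than measured.
baseline_kw = 120.0       # measured pre-retrofit connected load (kW)
post_kw = 80.0            # measured post-retrofit connected load (kW)
operating_hours = 3_500   # stipulated annual operating hours

annual_kwh_savings = (baseline_kw - post_kw) * operating_hours     # 140,000 kWh/yr
site_btu_savings = annual_kwh_savings * 3_412                      # site Btu/yr
source_btu_savings = site_btu_savings * 3.0                        # assumed site-to-source factor
print(f"{annual_kwh_savings:,.0f} kWh/yr, ~{source_btu_savings / 1e6:,.0f} MMBtu source/yr")
```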

2. Technical Standards Study Design and Application Standards Standard 6: Surveys and Interviews Surveys and interviews should be professionally conducted and employ objective, unbiased, non-leading questions. Closed-ended, scaled, or quantitative-response questions should be single-subject questions that allow for complete responses. Complex questions, or questions that require a preamble to set the stage, should be avoided.

2. Technical Standards Study Design and Application Standards Standard 7: Cost Effectiveness Test The evaluation should use the SEP-Recovery Act cost effectiveness test to assess the cost effectiveness of the SEP/ARRA initiatives and portfolio. SEP-Recovery Act test = source BTUs saved or generated per year per $1,000 of total funds spent. Programs and portfolios are considered cost effective if they achieve greater than 10 million source BTUs per year, over the EUL of actions taken, per $1,000 of total funding. No other cost effectiveness test applies to the SEP/ARRA funding.
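A minimal sketch of the cost effectiveness calculation described above, using hypothetical program values:

```python
# Hypothetical example of the SEP-Recovery Act cost effectiveness metric:
# annual source BTUs saved or generated per $1,000 of total funds spent,
# compared against the 10 million source BTU (10 MMBtu) threshold.
annual_source_mmbtu = 30_000     # annual source MMBtu saved/generated over the EUL of actions
total_funds_spent = 2_500_000    # total dollars spent

mmbtu_per_1000_dollars = annual_source_mmbtu / (total_funds_spent / 1_000)  # 12.0
is_cost_effective = mmbtu_per_1000_dollars > 10.0
print(mmbtu_per_1000_dollars, is_cost_effective)  # 12.0 True
```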

The National SEP/ARRA Evaluation Oak Ridge National Laboratory will lead and direct a national SEP/ARRA evaluation, independent from the states. The evaluation will select a sample of state-level programmatic activities within 16 broad programmatic areas to be evaluated. The sample will be selected to be representative of the national portfolio of state initiatives. It is anticipated that from 100 to 120 state initiatives will be sampled for the national evaluation. A nationally recognized evaluation expert will provide technical advice to ORNL. ORNL will contract the evaluation tasks to a team of professional evaluation experts with extensive experience across the range of evaluation tasks to be completed.

The National SEP/ARRA Evaluation (continued) ORNL will assess the rigor and reliability of the state implemented evaluation efforts and include the results of those studies in the national assessment, where possible. A number of conditions will impact this decision, including: a) Inclusion of state-studied activities in sample selected for national evaluation, b) Rigor and reliability of the study approach, c) Timing of the state study and the report delivery, d) Need for additional data to reliably estimate effects across all key SEP programmatic areas.

Data Needed For Evaluation In order for both state-implemented evaluations and the national evaluation to be successful, it is important that relevant information be collected and maintained by the states. Examples of the information needed include: a) Contact information for people served/impacted: name, company, address of contact, phone, email. b) Detailed description of services received: address of actions taken, recommendations from audits, measures taken, installation dates, etc. c) Anticipated results of the services received (changes expected as a result). For a detailed list of the types of data needed, see: California Evaluation Protocols, April 2006, page
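For illustration, a per-participant tracking record covering the data elements listed above might look like the sketch below; all field names and values are hypothetical, not a prescribed format:

```python
# Hypothetical per-participant record; field names and values are illustrative only.
participant_record = {
    "contact_name": "Jane Doe",
    "company": "Example Manufacturing Co.",
    "contact_address": "123 Main St, Springfield",
    "phone": "555-0100",
    "email": "jdoe@example.com",
    "service_address": "123 Main St, Springfield",           # address of actions taken
    "audit_recommendations": ["seal ducts", "upgrade lighting"],
    "measures_taken": ["T8 lighting retrofit"],
    "installation_dates": ["2010-05-15"],
    "anticipated_annual_kwh_savings": 42_000,                 # expected change from services
}
```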

Questions and Responses Questions not addressed in the presentation and discussion can be posed at this time.