Subcommittee Report Out: Best Available Data and Measure Complexity
Tim Melloch, January 2016


Presentation Overview

Subcommittee reports out on:
- TPP4: Using Best Available Data to Determine Deemed Savings: Best Practices, Recommendations
- TPP5: Issues and Recommendations for Reducing Measure Complexity

Next steps:
1. January Cal TF approval
2. Seek staff input on the documents
3. Use as guidance for future measure development

Best Available Data Subcommittee

- Overview
- Approach
- Observations
- High-Level Recommendations

“Best Available Data” for Deemed Savings: Overview

Current direction from the Commission is to reference DEER for “best available data” when developing non-DEER workpapers or performing other ex ante review. Commission decisions also appear to give CPUC Staff guidance and flexibility in interpreting that direction, including the guidance to:

“[Balance] the need for accurate ex ante values with the equally important need to continuously augment the portfolios with new technologies that offer promise… We also encourage Commission Staff not to allow ‘the perfect to be the enemy of the good,’ in general but especially in determining the ex ante values for new technologies that offer considerable promise…”

and

“utilities are expected to use DEER assumptions etc. unless the Commission Staff agrees with their proposal for such replacements.”

“Best Available Data” for Deemed Savings: Overview

Establishing a clear definition of “best available data” will deliver the following benefits:

- Streamline the process: eliminate time wasted applying different interpretations or attempting to achieve a higher level of certainty than is appropriate.
- Increase certainty: create greater certainty in the ex ante values used to calculate measure and portfolio performance, limiting retroactive adjustments.
- Increase confidence in and rigor of values: clear, transparent standards for developing deemed values increase confidence in and rigor of those values (while acknowledging the existence and magnitude of error/uncertainty inherent in any method of measure estimation).

Best Available Data: Approach

The phrase “best available data” lacks a clear definition, which leads to differing interpretations and delays in workpaper approval. The subcommittee is working to define the phrase and recommend consistent approaches for meeting this requirement. Sources being leveraged include:
- Technical Reference Manuals (TRMs) from jurisdictions across the United States
- The Uniform Methods Project (UMP)
- Draft Evaluation, Measurement and Verification (EM&V) Guidelines for the Clean Power Plan (CPP)

“Best Available Data” for Deemed Savings: Observations: TRM Best Practices

Cal TF Staff gathered and reviewed TRMs from across the country. Key principles from the TRM review fall into three main categories:

Substantive guidelines:
- Diligent review/approval process for third-party data
- Higher weight given to empirical data
- Consider the appropriateness of the data for the implementation approach and population served
- Consider the source of the data: is bias possible?
- Consider when and where the data were gathered (local or regional sources preferred)
- Identify the rigor and statistical significance required for studies

Process:
- The TRM is developed through a public, collaborative process that allows parties to comment, provide information and studies, and resolve misunderstandings and differences; unresolved items are tracked through a “Comparison Exhibit”.
- Regulatory staff participation is key; the regulatory commission is the ultimate decision-maker.

Form:
- The TRM is well documented and publicly accessible, and savings values are reproducible.
- Source information is readily available.

“Best Available Data” for Deemed Savings: Observations: Uniform Methods Project (UMP)

This initiative is funded by the US Department of Energy to develop measurement and verification protocols for determining energy savings for commonly implemented program measures. Many of its specific goals could also apply to the “Best Available Data” project, including:
- Guidelines that can help strengthen the credibility of energy efficiency program savings calculations
- Clear, accessible, step-by-step protocols to determine savings for the most common energy efficiency measures
- Support for consistency and transparency in how savings are calculated
- Comparability of savings across similar efficiency programs and measures in different jurisdictions
- Increased acceptance of reported energy savings by financial and regulatory communities

“Best Available Data” for Deemed Savings: Observations: Clean Power Plan (Draft Guidelines)

- Use actual data and widely accepted methods.
- Consider how measures are being implemented:
  - The measure and its purpose
  - The applicability of the data to the situation under which the measure is being implemented
- Not all measures should be deemed! Deem only relatively simple, well-defined efficiency measures where there is little uncertainty in average savings.
- Measures should be clearly defined, well documented, and public. The final emission guidelines include a number of safeguards and quality-control features intended to ensure the accuracy and reliability of claimed EE savings.
- Measures should be developed through joint and collaborative research.
- EUL values should be updated every five years.

“Best Available Data” for Deemed Savings: Observations/Recommendations

“Best practices” for ensuring data quality and rigor for deemed values, or “best available data”:

Substantive (high-level guidelines):
- One size does not fit all.
- Some expert judgment is needed, but document when it is used and provide a clear rationale.

Process:
- Public and collaborative peer review process
- Regulatory staff must actively participate!
- The regulatory agency is the final decision-maker!

Form:
- Well documented and transparent
- Original sources available

“Best Available Data” for Deemed Savings: Cal TF Proposal

The paper concludes by providing recommendations addressing the following areas:
- Define “best available data”
- Establish criteria
- Establish process
- Establish approach
- Establish DEER exceptions
- Establish criteria for studies
- Establish criteria for “more data needed”
- Establish a process for “interim workpaper approval”
- Establish a “dispute resolution process”
- Establish a plan for “real data”

“Best Available Data” for Deemed Savings: In Conclusion

- Determining “best available data” requires balance and should not unduly impede innovation.
- CPUC policy is that “the perfect should not be the enemy of the good”: new measures should be introduced even when the data are not perfect.
- If existing data are not sufficient to establish “reasonable expected values”, there needs to be a plan to gather more data through implementation or early EM&V.

Measure Complexity Subcommittee

- Background/Overview
- Current Process
- Approach to Simplifying

Measure Complexity: Background/Overview

Measure complexity, in the context used in TPP5, generally refers to:
- How many different “measure combinations” should be developed for a measure to account for differences in how the measure is deployed, where it will be installed, and how it will be used
- The engineering approach used to generate savings estimates
- The application of additional factors such as HVAC “interactive effects”

In seeking to populate DEER with the best available information, the CPUC has recognized that DOE-2.2 modeling is an appropriate source for weather-sensitive estimates. The development of multiple measure combinations through energy modeling and the application of multipliers (such as interactive effects) is intended to provide accurate savings estimates.

Measure Complexity: Current State

There are over 600,000 “measure combinations” in DEER. Measure combinations vary by climate, building vintage, building type, and utility (different interactive effects are applied by utility). More measure combinations do not necessarily mean more precision, but they do lead to:
- Higher costs to develop and maintain measures
- Difficulty QA/QC’ing, managing, and using the data
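To illustrate how quickly combination counts multiply across such dimensions, here is a minimal sketch in Python; the per-dimension counts below are hypothetical, not actual DEER figures.

```python
# Hypothetical dimension counts (not actual DEER figures) showing how
# per-measure combination counts multiply across dimensions.
dimensions = {
    "climate_zone": 16,     # e.g., California's building climate zones
    "building_type": 12,
    "building_vintage": 8,
    "utility": 4,           # different interactive effects per utility
}

combinations_per_measure = 1
for count in dimensions.values():
    combinations_per_measure *= count

print(f"combinations per measure: {combinations_per_measure:,}")  # 6,144
# Multiplied across roughly a hundred measures, totals on the order of
# the 600,000+ combinations noted above follow quickly.
```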

Measure Complexity: Current State

Why is simplifying measure complexity important? A recent example: with the 2013 Title 24 update to building standards and the 2014 updates to the supporting weather data, all IOU direct and indirect climate-sensitive measures required revision. The vast majority of IOU measures are considered weather sensitive because interactive factors are applied to products operating in conditioned building spaces. The cost and time for all IOUs to update these workpapers were extensive, yet most measure values changed by less than 5%. The resulting differences in measure estimates may not be statistically valid, because measure uncertainty exceeds 5%.
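A minimal sketch of that statistical-validity screen, assuming a simple comparison of relative change against relative uncertainty (the function name is illustrative; the 5% threshold is the figure cited above):

```python
def revision_is_meaningful(old_kwh: float, new_kwh: float,
                           relative_uncertainty: float = 0.05) -> bool:
    """Screen a workpaper revision: is the relative change in the savings
    estimate larger than the estimate's own relative uncertainty?"""
    relative_change = abs(new_kwh - old_kwh) / old_kwh
    return relative_change > relative_uncertainty

# A value that moved 3% under 5% measure uncertainty is within the noise...
print(revision_is_meaningful(100.0, 103.0))  # False
# ...while an 8% move would clear the screen.
print(revision_is_meaningful(100.0, 108.0))  # True
```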

Measure Complexity: Process for Simplifying

Step 1: Establish measure definitions, measure impact (normal, low, high), initial measure cost-effectiveness, and whether the measure should be considered for “interim measure status” (insufficient information for a full measure).

Step 2: Define the appropriate measure calculation approach:
- Engineering equation with documented inputs (illustrated in the sketch after this list)
- Engineering equation or curve fit with calibrated results, with statistically justified inputs
- Building model
- Calibrated building model
- Evaluation results
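As an illustration of the first approach above, a generic lighting-retrofit engineering equation; the function and all input values are hypothetical, not drawn from DEER or an actual workpaper:

```python
def lighting_kwh_savings(watts_base: float, watts_eff: float,
                         annual_hours: float,
                         interactive_factor: float = 1.0) -> float:
    """Generic engineering equation for a lighting retrofit: wattage
    reduction times operating hours, scaled by an HVAC interactive-effects
    factor for conditioned spaces. Each input should be documented."""
    return (watts_base - watts_eff) / 1000.0 * annual_hours * interactive_factor

# Hypothetical inputs: 60 W baseline, 9 W replacement, 1,000 h/yr,
# 1.05 interactive factor for a conditioned space.
print(lighting_kwh_savings(60, 9, 1000, 1.05))  # ~53.6 kWh/yr
```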

Measure Complexity: Process for Simplifying

Step 3: Assess the key (most impactful) inputs and the appropriate number and types of measure combinations.
- Unnecessary measure complexity can be identified with high-speed, high-volume parametric analysis, which can determine which measure values differ enough to warrant a distinct measure combination.
- If the measure values for two measure combinations differ by less than 10%, consider combining them (see the sketch after this list). Additional consideration may be appropriate for high-impact measures.
- Consider whether the distinctions to be made between measure combinations are even possible during implementation. For example, should measure combinations be developed for a midstream program if we don't know which building types and vintages the measures will go into?
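A minimal sketch of that collapse rule, assuming parametric results arrive as (combination, savings) pairs; the labels, values, and 10% threshold are illustrative:

```python
# Hypothetical parametric results: (measure combination, annual kWh savings).
results = [
    ("CZ01/pre-1978", 420.0),
    ("CZ01/post-1978", 445.0),   # within 10% of CZ01/pre-1978
    ("CZ12/pre-1978", 610.0),
    ("CZ12/post-1978", 655.0),
]

def collapse(results, threshold=0.10):
    """Greedily merge combinations whose savings fall within `threshold`
    (relative) of the first member of the current group."""
    groups = []
    for label, value in sorted(results, key=lambda r: r[1]):
        if groups and abs(value - groups[-1][0][1]) / groups[-1][0][1] <= threshold:
            groups[-1].append((label, value))
        else:
            groups.append([(label, value)])
    return groups

for group in collapse(results):
    print([label for label, _ in group])
# ['CZ01/pre-1978', 'CZ01/post-1978']
# ['CZ12/pre-1978', 'CZ12/post-1978']
```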

Measure Complexity: Process for Simplifying

Step 4: Consider the applicability and validity of the available data. Criteria to consider include:
- Applicability of the sample data to the anticipated program participation population
- Data collection and analysis methods
- Study age
- Measurement error
- Sample size and confidence interval (see the sketch after this list)

If the existing data are insufficient to make meaningful distinctions between measure combinations, consider collecting additional data while using a simplified measure in the interim.
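A minimal sketch of the sample-size and confidence-interval criterion, using a normal approximation; the per-site savings values are hypothetical:

```python
import statistics

def mean_with_ci(samples, z=1.96):
    """Mean of sampled savings with an approximate 95% confidence
    interval (normal approximation; adequate as a rough screen)."""
    m = statistics.mean(samples)
    se = statistics.stdev(samples) / len(samples) ** 0.5
    return m, (m - z * se, m + z * se)

# Hypothetical per-site annual kWh savings measurements.
site_savings = [510, 480, 530, 495, 505, 470, 520, 515]
mean, (lo, hi) = mean_with_ci(site_savings)
print(f"mean = {mean:.0f} kWh, 95% CI = ({lo:.0f}, {hi:.0f})")
# If the confidence intervals of two candidate combinations overlap
# heavily, the data may not support treating them as distinct.
```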

Measure Complexity: Process for Simplifying

Step 5: Consider measure “administrative” issues:
- Cost of workpaper development, both initial and maintenance
- Cost of processing measure data internally for reporting purposes
- Risk of human error due to the number of measure combinations
- Infrastructure required for developing, tracking, and reporting
- Program and customer impacts, and data collection requirements, due to multiple measure combinations

Step 6: Apply professional judgment. Consider using professional judgment to further simplify a measure where appropriate, but document when professional judgment is used and explain the basis for the simplification.

Measure Complexity: Process for Simplifying

Step 7: Other issues to address:
- Guidelines for running parametric analysis
- Calibration guidelines
- Appropriate application of interactive effects
- Definition/consideration of bias
- Best practices for measure documentation
- Early retirement
- Add-on measures
- Baseline determination and justification

Request of Cal TF: General Approval

- Are these documents ready to be shared with CPUC Staff for comment?
- Are you comfortable with “test driving” these guidelines for future measure development in 2016? The guidelines would be revisited in 2017 to assess how well they worked.