Subcommittee Report Out: Best Available Data and Measure Complexity
Tim Melloch, January 2016
Presentation Overview

Subcommittee reports out on:
- TPP4: Using Best Available Data to Determine Deemed Savings: Best Practices and Recommendations
- TPP5: Issues and Recommendations for Reducing Measure Complexity

Next steps:
1. January Cal TF approval
2. Seek CPUC Staff input on the documents
3. Use as guidance for future measure development
Best Available Data Subcommittee

- Overview
- Approach
- Observations
- High-Level Recommendations
“Best Available Data” for Deemed Savings: Overview

Current direction from the Commission is to reference DEER for “best available data” when developing non-DEER workpapers or performing other ex-ante review. Commission decisions also appear to allow CPUC Staff guidance and flexibility in interpreting that direction, including the guidance to:

“[Balance] the need for accurate ex ante values with the equally important need to continuously augment the portfolios with new technologies that offer promise… We also encourage Commission Staff not to allow ‘the perfect to be the enemy of the good,’ in general but especially in determining the ex ante values for new technologies that offer considerable promise…”

and

“utilities are expected to use DEER assumptions etc. unless the Commission Staff agrees with their proposal for such replacements”
“Best Available Data” for Deemed Savings: Overview (cont.)

Establishing a clear definition of “best available data” will deliver the following benefits:
- Streamline the process: eliminate time wasted applying differing interpretations or pursuing a higher level of certainty than is appropriate
- Increase certainty: give ex-ante values greater certainty when calculating measure and portfolio performance, limiting retroactive adjustments
- Increase confidence in and rigor of values: clear, transparent standards for developing deemed values increase confidence in and rigor of those values, while acknowledging the existence and magnitude of the error and uncertainty inherent in any measure estimation method
Best Available Data: Approach

- The phrase “best available data” lacks a clear definition, which leads to differing interpretations and delays in workpaper approval
- The Subcommittee is working to define the phrase and recommend consistent approaches for meeting the requirement; sources being leveraged include:
  - Technical Reference Manuals (TRMs) from jurisdictions across the United States
  - The Uniform Methods Project (UMP)
  - Draft Evaluation, Measurement and Verification (EM&V) Guidelines for the Clean Power Plan (CPP)
“Best Available Data” for Deemed Savings: Observations: TRM Best Practices

Cal TF Staff gathered and reviewed TRMs from across the country. Key principles from the review fall into three main categories:

Substantive guidelines:
- Diligent review/approval process for third-party data
- Higher weight given to empirical data
- Consider the appropriateness of the data for the implementation approach and the population served
- Consider the source of the data: is there a possibility of bias?
- Consider when and where the data were gathered (local or regional sources preferred)
- Identify the rigor and statistical significance required of studies

Process:
- The TRM is developed through a public, collaborative process that allows parties to comment, provide information and studies, and resolve misunderstandings and differences; unresolved items are tracked in a “Comparison Exhibit”
- Regulatory staff participation is key; the regulatory commission is the ultimate decision-maker

Form:
- The TRM is well documented and publicly accessible, and its savings values are reproducible
- Source information is readily available
“Best Available Data” for Deemed Savings: Observations: Uniform Methods Project (UMP)

The UMP is funded by the U.S. Department of Energy to develop measurement and verification protocols for determining energy savings from commonly implemented program measures. Many of its stated goals could also apply to the “Best Available Data” project, including:
- Strengthen the credibility of energy efficiency program savings calculations
- Provide clear, accessible, step-by-step protocols for determining savings for the most common energy efficiency measures
- Support consistency and transparency in how savings are calculated
- Allow comparison of savings across similar efficiency programs and measures in different jurisdictions
- Increase the acceptance of reported energy savings by the financial and regulatory communities
“Best Available Data” for Deemed Savings: Observations: Clean Power Plan (Draft Guidelines)

- Use actual data and widely accepted methods
- Consider how measures are being implemented: the measure and its purpose, and the applicability of the data to the situation in which the measure is implemented
- Not all measures should be deemed: deem only relatively simple, well-defined efficiency measures with little uncertainty in average savings (one way to screen for this is sketched after this list)
- Measures should be clearly defined, well documented, and public
- The final emission guidelines include a number of safeguards and quality-control features intended to ensure the accuracy and reliability of claimed EE savings
- Measures should be developed through joint and collaborative research
- EUL values should be updated every five years
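The "little uncertainty in average savings" criterion can be made concrete with a simple screen over evaluated site-level savings. The sketch below is a hypothetical illustration, not a CPP rule: the function name, the sample data, and the 10% relative-uncertainty threshold are all assumptions.

```python
# Hypothetical screen: is a measure a good candidate for deemed savings?
# All names, data, and the 10% threshold are illustrative assumptions.
import statistics

def is_good_deemed_candidate(site_savings_kwh, max_rel_uncertainty=0.10):
    """Return True if average savings are stable enough to deem.

    site_savings_kwh: per-site evaluated savings from a study sample.
    """
    mean = statistics.mean(site_savings_kwh)
    # Standard error of the mean as a rough proxy for uncertainty
    sem = statistics.stdev(site_savings_kwh) / len(site_savings_kwh) ** 0.5
    return mean > 0 and sem / mean <= max_rel_uncertainty

print(is_good_deemed_candidate([410, 395, 420, 405, 398, 415]))  # True: tight spread
print(is_good_deemed_candidate([100, 900, 50, 600, 20, 700]))    # False: too variable
```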
“Best Available Data” for Deemed Savings: Observations/Recommendations

“Best practices” for ensuring data quality and rigor for deemed values, or “best available data”:

Substantive:
- High-level guidelines; one size does not fit all
- Some expert judgment is needed, but document when it is used and provide a clear rationale

Process:
- Public and collaborative peer-review process
- Regulatory staff must actively participate
- The regulatory agency is the final decision-maker

Form:
- Well documented and transparent
- Original sources available
“Best Available Data” for Deemed Savings: Cal TF Proposal

The paper concludes with recommendations addressing the following areas:
- Define “best available data”
- Establish criteria
- Establish a process
- Establish an approach
- Establish DEER exceptions
- Establish criteria for studies
- Establish criteria for “more data needed”
- Establish a process for “interim workpaper approval”
- Establish a “dispute resolution process”
- Establish a plan for “real data”
“Best Available Data” for Deemed Savings: In Conclusion

- Determining “best available data” requires balance and should not unduly impede innovation
- CPUC policy is that “the perfect should not be the enemy of the good”: new measures should be introduced even when the data are not perfect
- If existing data are not sufficient to establish reasonable expected values, there must be a plan to gather more data through implementation or early EM&V
Measure Complexity Subcommittee

- Background/Overview
- Current State
- Process for Simplifying
Measure Complexity: Background/Overview

In the context of TPP5, “measure complexity” generally refers to:
- How many different “measure combinations” should be developed for a measure, to account for differences in how the measure is deployed, where it will be installed, and how it will be used
- The engineering approach used to generate savings estimates
- The application of additional factors, such as HVAC “interactive effects”

In seeking to populate DEER with the best available information, the CPUC has recognized DOE-2.2 modeling as an appropriate source for weather-sensitive estimates. Developing multiple measure combinations through energy modeling and applying multipliers (such as interactive effects) is intended to produce accurate savings estimates; a minimal sketch of such a calculation follows.
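As a minimal sketch of how a deemed value with an interactive-effects multiplier might be computed, consider a simple lighting retrofit. The function and all input values below are illustrative assumptions, not DEER methods or values.

```python
# Minimal sketch of a deemed-savings calculation with an HVAC
# interactive-effects multiplier. Inputs are illustrative placeholders.

def unit_energy_savings_kwh(watts_base, watts_efficient, annual_hours,
                            hvac_interactive_factor):
    """First-year kWh savings for one installed unit.

    hvac_interactive_factor > 1.0 credits reduced cooling load from
    lower lighting heat gain; < 1.0 would debit added heating load.
    """
    direct_kwh = (watts_base - watts_efficient) / 1000.0 * annual_hours
    return direct_kwh * hvac_interactive_factor

# Example: 60 W baseline lamp -> 12 W LED, 3,000 hours/year, 8% cooling credit
print(unit_energy_savings_kwh(60, 12, 3000, 1.08))  # ~155.5 kWh
```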
Measure Complexity: Current State

- There are over 600,000 “measure combinations” in DEER; combinations vary by climate zone, building vintage, building type, and utility (each utility applies different interactive effects)
- More measure combinations do not necessarily mean more precision, but they do lead to:
  - Higher costs to develop and maintain measures
  - Difficulty QA/QC'ing, managing, and using the data

The arithmetic sketched below shows how quickly these dimensions multiply.
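A hypothetical back-of-the-envelope calculation shows how combination counts of this magnitude arise. Only the roughly 600,000 total comes from the slide; the per-dimension counts are assumptions for illustration (California does define 16 building climate zones).

```python
# Hypothetical arithmetic: measure combinations multiply across permutation
# dimensions. Counts below are illustrative assumptions, not DEER's actual
# dimension sizes; only the ~600,000 order of magnitude comes from the slide.
climate_zones      = 16   # California's building climate zones
building_types     = 30
vintages           = 10
utilities          = 4
combos_per_measure = climate_zones * building_types * vintages * utilities
print(combos_per_measure)        # 19,200 combinations per measure
print(32 * combos_per_measure)   # ~614,400: a few dozen measures suffice
```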
Measure Complexity: Current State (cont.)

Why is simplifying measure complexity important? A recent example: after the 2013 Title 24 update to the building standards and the 2014 updates to the supporting weather data, all IOU direct and indirect climate-sensitive measures required revision.
- The vast majority of IOU measures are considered weather sensitive because interactive factors are applied to products operating in conditioned building spaces
- The cost and time for all IOUs to update these workpapers were extensive, yet most measure values changed by less than 5%
- The resulting differences in measure estimates may not be statistically valid, since measure uncertainty exceeds 5% (see the check sketched below)
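The materiality question above reduces to a one-line check: does a revision exceed the measure's uncertainty band? The sketch is illustrative; the 5% figures mirror the slide, and the function name is an assumption.

```python
# Illustrative check: is a revised savings value distinguishable from the
# old one, given measure-level uncertainty? The 5% band mirrors the slide.

def revision_is_material(old_kwh, new_kwh, rel_uncertainty):
    """True if the relative change exceeds the measure's uncertainty band."""
    rel_change = abs(new_kwh - old_kwh) / old_kwh
    return rel_change > rel_uncertainty

# A 4% change against >5% uncertainty is not statistically meaningful
print(revision_is_material(1000.0, 1040.0, 0.05))  # False
```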
Measure Complexity: Process for Simplifying

Step 1: Establish measure definitions; measure impact (normal, low, or high); initial measure cost-effectiveness; and whether the measure should be considered for “interim measure status” (insufficient information for a full measure).

Step 2: Define the appropriate measure calculation approach, choosing among:
- Engineering equation with documented inputs
- Engineering equation or curve fit with calibrated results and statistically justified inputs
- Building model
- Calibrated building model
- Evaluation results

A small illustration of the curve-fit approach follows.
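As a small illustration of the curve-fit rung of this hierarchy, the sketch below fits metered savings against cooling degree days. The data points, driver variable, and names are invented for illustration; they are not from any DEER study.

```python
# Illustrative curve fit: metered savings regressed on a weather driver.
# Data points and variable names are made up for this sketch.
import numpy as np

cooling_degree_days = np.array([200, 400, 600, 800, 1000])
metered_kwh_savings = np.array([130, 255, 410, 520, 660])

# Least-squares linear fit: savings ~= slope * CDD + intercept
slope, intercept = np.polyfit(cooling_degree_days, metered_kwh_savings, 1)
predicted = slope * 700 + intercept   # estimate for a 700-CDD climate zone
print(round(slope, 3), round(intercept, 1), round(predicted, 1))
```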
Measure Complexity: Process for Simplifying (cont.)

Step 3: Assess the key (most impactful) inputs and the appropriate number and types of measure combinations.
- Unnecessary measure complexity can be identified with high-speed, high-volume parametric analysis, which can determine which measure values differ enough to warrant a distinct measure combination
- If the measure values for two measure combinations differ by less than 10%, consider combining them; additional consideration may be appropriate for high-impact measures
- Consider whether the distinctions to be made between measure combinations are even possible during implementation; for example, should measure combinations be developed for a midstream program when the building types and vintages the measures will go into are unknown?

One simple way to apply the 10% rule is sketched below.
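One way to operationalize the 10% combining rule is a greedy grouping over sorted parametric results. This is an assumption about implementation, not the subcommittee's method; combo names and savings values are illustrative.

```python
# Sketch of the 10% combining rule: group parametric results whose savings
# fall within the threshold of a group anchor. Greedy grouping on sorted
# values is one simple approach; names and data are illustrative.

def combine_measure_combos(savings_by_combo, threshold=0.10):
    """Group combos whose savings are within `threshold` of a group anchor."""
    groups = []
    for combo, kwh in sorted(savings_by_combo.items(), key=lambda kv: kv[1]):
        if groups and kwh <= groups[-1]["anchor"] * (1 + threshold):
            groups[-1]["combos"].append(combo)
        else:
            groups.append({"anchor": kwh, "combos": [combo]})
    return [g["combos"] for g in groups]

parametric = {"CZ01_res": 100.0, "CZ03_res": 104.0, "CZ09_res": 109.0,
              "CZ12_res": 125.0, "CZ15_res": 131.0}
print(combine_measure_combos(parametric))
# [['CZ01_res', 'CZ03_res', 'CZ09_res'], ['CZ12_res', 'CZ15_res']]
```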
Measure Complexity: Process for Simplifying (cont.)

Step 4: Consider the applicability and validity of available data. Criteria to consider include:
- Applicability of the sample data to the anticipated program participant population
- Data collection and analysis methods
- Study age
- Measurement error
- Sample size and confidence interval (see the sketch below)

If existing data are insufficient to make meaningful distinctions between measure combinations, consider collecting additional data while using a simplified measure in the interim.
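For the sample-size and confidence-interval criterion, a common EM&V convention is 90% confidence with 10% precision. The sketch below approximates a 90% interval with the normal z-value (1.645); the sample data are illustrative.

```python
# Illustrative 90% confidence interval on mean savings from an evaluation
# sample, using the normal z-value as an approximation for larger samples.
import statistics

def ci90_half_width(sample):
    """Approximate 90% CI half-width (z = 1.645 for a normal distribution)."""
    sem = statistics.stdev(sample) / len(sample) ** 0.5
    return 1.645 * sem

sample = [410, 395, 420, 405, 398, 415, 402, 409]  # made-up kWh savings
mean = statistics.mean(sample)
hw = ci90_half_width(sample)
print(f"{mean:.1f} kWh +/- {hw:.1f} ({hw / mean:.1%} relative precision)")
```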
Measure Complexity: Process for Simplifying (cont.)

Step 5: Consider measure “administrative” issues:
- Cost of workpaper development, both initial and ongoing maintenance
- Cost of processing measure data internally for reporting purposes
- Risk of human error due to the number of measure combinations
- Infrastructure required for developing, tracking, and reporting
- Program and customer impacts, and data collection requirements, arising from multiple measure combinations

Step 6: Apply professional judgment. Consider using professional judgment to further simplify a measure where appropriate, but document when it is used and explain the basis for the simplification.
Measure Complexity: Process for Simplifying (cont.)

Step 7: Other issues to address:
- Guidelines for running parametric analysis
- Calibration guidelines
- Appropriate application of interactive effects
- Definition and consideration of bias
- Best practices for measure documentation
- Early retirement
- Add-on measures
- Baseline determination and justification
Request of Cal TF: General Approval

- Are these documents ready to be shared with CPUC Staff for comment?
- Are you comfortable with “test driving” these guidelines for measure development in 2016? The guidelines would be revisited in 2017 to assess how well they worked.