2013 California Statewide Critical Peak Pricing Evaluation
Josh L. Bode and Candice A. Churchwell
DRMEC Spring 2014 Load Impacts Evaluation Workshop
San Francisco, California, May 2014

1 Presentation overview
 Introduction and comparison of rates
 PG&E Results
 SCE Results
 SDG&E Results
 Appendix: Evaluation methodology

2 Event days are different across the three utilities
 PG&E called 8 events, SCE called 10 events, and SDG&E called 4 events
 Each utility calls event days based on its own system conditions
 SDG&E's events ran from 11 AM to 6 PM, while PG&E's and SCE's ran from 2 PM to 6 PM
 System load patterns across utilities are not always coincident, particularly between Northern and Southern California
 No event day was common to all three utilities
 Comparisons of impacts between the utilities should therefore be made with caution

3 Average event day percent reductions by utility are in a similar range
 PG&E's average load reduction per customer was 8.6% (22.4 kW)
 SCE's average load reduction per customer was 5.8% (14.2 kW)
 SDG&E's average load reduction per customer was 6.9% (18.4 kW)

4 Reductions were slightly larger than last year's
[Table: number of events, accounts, average event temperature (°F), reference load (MW), load impact (MW), and percent impact for PG&E, SCE, and SDG&E in 2012 and 2013]
 PG&E and SDG&E average event conditions were hotter than in 2012
 PG&E showed the largest jump in reductions, from 30 MW to 38 MW, due to both more participants and larger percent reductions
 The customer mix has evolved substantially over time, but response has been consistent
 Even when enrollment counts look similar, customers exit and join, leading to changes in the underlying population
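The columns in that comparison relate through a simple identity; a minimal worked example with made-up numbers (not values from the table) shows how the MW, percent, and per-customer figures fit together:

```python
# Illustrative arithmetic only: the numbers are hypothetical, not evaluation results.
reference_load_mw = 450.0   # estimated aggregate load absent the event
observed_load_mw = 412.0    # metered aggregate load during the event window
enrolled_accounts = 1700

load_impact_mw = reference_load_mw - observed_load_mw
percent_impact = 100.0 * load_impact_mw / reference_load_mw
per_customer_kw = 1000.0 * load_impact_mw / enrolled_accounts

print(f"{load_impact_mw:.1f} MW impact = {percent_impact:.1f}% of reference, "
      f"or {per_customer_kw:.1f} kW per enrolled account")
```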

PG&E Specific Results 5

6 PG&E’s average load reduction was 8.6%, or 38.4 MW, across the 8 events in 2013

7 PG&E detailed event load impacts
* Unofficial event
** Avg. event estimates do not include the unofficial event

8 PG&E's demand reductions were concentrated in two industries
 Manufacturing; Wholesale & Transport; and Agriculture, Mining, & Construction accounted for 41% of the load and over 75% of the impacts
 While the Offices, Hotels, Finance, and Services sector had the most load (36%), it accounted for only 16% of program impacts

9 Not surprisingly, the largest customers account for a large share of the demand reductions
 On a percentage basis, reductions were similar for large and smaller customers alike

10 Ex ante estimates relied on available historical data
1. Model loads absent DR as a function of temperature and month
2. Estimate loads absent DR for 1-in-2 and 1-in-10 weather conditions
3. Model historical percent impacts as a function of weather
4. Estimate percent impacts under 1-in-2 and 1-in-10 conditions
5. Combine loads and percent reductions (see the sketch below)
 There are no robust empirical data on how medium customers will respond when defaulted to CPP
 Percent impacts were based on the historical 2012–2013 industry-specific percent load reductions as a function of weather
 Medium-customer reference loads were developed using a representative sample of customers and estimated by LCA and industry
 The industry-specific percent load reductions were then applied to the medium customer loads
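A minimal sketch of that five-step flow, assuming simple linear fits and hypothetical input data (the actual evaluation models also control for month and are estimated separately by industry and LCA):

```python
import numpy as np
import pandas as pd

# Hypothetical inputs: one row per historical event day (column names assumed).
hist = pd.DataFrame({
    "temp_f":      [88, 92, 95, 97, 99, 101],
    "ref_load_mw": [420, 440, 455, 462, 470, 478],      # loads absent DR
    "pct_impact":  [0.071, 0.078, 0.082, 0.084, 0.086, 0.088],
})

# Steps 1 and 3: fit reference load and percent impact as functions of weather.
load_fit = np.polyfit(hist["temp_f"], hist["ref_load_mw"], 1)
pct_fit = np.polyfit(hist["temp_f"], hist["pct_impact"], 1)

# Steps 2 and 4: predict both quantities under 1-in-2 and 1-in-10 planning weather.
planning_temps = {"1-in-2": 98.0, "1-in-10": 104.0}
for scenario, temp in planning_temps.items():
    ref_mw = np.polyval(load_fit, temp)
    pct = np.polyval(pct_fit, temp)
    # Step 5: combine loads and percent reductions into an aggregate MW impact.
    print(f"{scenario}: {ref_mw:.0f} MW reference x {pct:.1%} = {ref_mw * pct:.1f} MW impact")
```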

11 Ex ante percent reductions are in line with historical percent reductions
 2012 and 2013 events were used as the basis for the ex ante estimates
 Estimates were produced by LCA
 Comparison is based on:
   Historical customers
   The same event window and historical events
 Assumes percent reductions of new customers in each LCA will be similar

12 Reference loads align with historical loads
 Reference loads were separately estimated for large and medium customers by LCA
 Comparison is based on:
   Historical customers
   The same event window and historical events
 New large customers are assumed to be similar to old ones

13 Comparison of 2013 ex ante estimates to prior-year estimates
[Table: by weather year (1-in-2 and 1-in-10) and forecast year, accounts, reference loads (MW), percent reductions, and aggregate load impacts (MW) from the 2012 and 2013 estimates]
 Differences are mostly due to changes in the enrollment forecasts
 2014 ex ante impacts align well with the 2013 average event response
 Over time, customers who reduce demand have tended to remain on CPP, while those less likely to respond have migrated elsewhere

14 Due to limited empirical data, ex ante estimates for medium customers have a higher degree of uncertainty
[Table: for the 1-in-10 and 1-in-2 August system peak days by forecast year, enrolled accounts, average reference load, average estimated load with DR, average load impact (MW, 1 to 6 PM), percent load reduction, and weighted temperature (°F)]

SCE Specific Results 15

16 SCE’s average load reduction was 5.8%, or 35.5 MW, across the 10 event days in Jul-Sep 2013

17 SCE detailed event load impacts

18 Two industry groups account for 87% of the demand reductions
 Manufacturing accounts for roughly 27% of customers and load but provides 68% of the reductions
 Wholesale and transport accounts for 17% of the customers and load but provides 12% of the reductions

19 At SCE, larger customers not only have more load but also deliver larger percent reductions

20 Ex ante percent reductions line up with historical percent reductions
 2012 and 2013 events were used as the basis for the ex ante estimates
 Estimates were produced by area
 Comparison is based on:
   Historical customers
   The same event window and historical events
 Assumes percent reductions of new customers in each area will be similar

21 Comparison of 2013 ex ante estimates to prior-year estimates
 2013 estimates reflect the evolution of SCE's default CPP customers and more recent historical data
 2014 ex ante impacts align well with the 2013 average event response of 35.5 MW
 Over time, customers who reduce demand have tended to remain on CPP, while those less likely to respond have migrated elsewhere
[Table: by weather year (1-in-2 and 1-in-10) and forecast year, reference loads, percent load impacts, accounts, and aggregate load impacts (MW) from the 2012 and 2013 estimates]

SDG&E Specific Results 22

23 SDG&E’s average weekday load impact was 6.9%, 19.6 MW

24 SDG&E detailed event load impacts

25 SDG&E had more customers and load in the offices sector
 Offices made up 46% of the load at SDG&E (versus 36% at PG&E and 23% at SCE), and they also reduced demand more
 The Institutional and Wholesale & Transport segments still performed the best and accounted for a substantial share of impacts

26 At SDG&E, larger customers also accounted for a large share of the demand reductions
 On a percentage basis, there is no discernible pattern of responsiveness by size

27 Ex ante percent reductions are in line with 2012–2013 historical percent reductions
 2012 and 2013 events were used as the basis for the ex ante estimates
 Comparison is based on:
   Historical customers
   The same event window and historical events

28 Ex ante reference loads align with historical loads

29 Comparison of 2013 ex ante estimates to prior-year estimates
 Differences are due to small changes in the estimated percent reductions
 The 2013 estimates incorporate more historical data (2012 and 2013 events vs. 2012 alone)
[Table: by weather year (1-in-2 and 1-in-10) and forecast year, reference loads, percent load impacts, accounts, and aggregate load impacts (MW) from the 2012 and 2013 estimates]

30 Due to limited empirical data, ex ante estimates for medium customers have a higher degree of uncertainty
[Table: for the 1-in-10 and 1-in-2 August system peak days by forecast year, enrolled accounts, average reference load, average estimated load with DR, average load impact (MW, 11 AM to 6 PM), percent load reduction, and weighted temperature (°F)]
 Impacts are based on large default CPP customers, but adjusted for differences in the industry mix and size of medium customers

For comments or questions, contact:
Josh L. Bode, M.P.P.
Candice A. Churchwell, M.S.
Nexant, Inc.
101 Montgomery St., 15th Floor
San Francisco, CA

Appendix: Evaluation Methodology and Validation 32

33 The focus of the evaluation was on the dispatchable event day response
 CPP rates introduce two changes (illustrated in the sketch below):
   Higher prices on peak hours of critical days (the CPP adder), designed to encourage customers to reduce demand
   Rate discounts on non-event days to offset the CPP adder
 The impact of the rate discount on non-event days is not estimated for three reasons:
   The focus for planning and operations is on the dispatchable demand reductions that can be attained
   The pre-enrollment data needed to quantify non-event day impacts is too distant (four or five years prior)
   Most non-event day impacts, if any, are now embedded in system load forecasts (and are not incremental)
 Analyses in 2010 and 2011 did not find statistically significant impacts due to the rate discount
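A stylized illustration of those two rate components (the event-day adder and the non-event-day discount), using hypothetical prices and hours rather than any utility's actual tariff:

```python
# Hypothetical CPP rate parameters: not an actual PG&E, SCE, or SDG&E tariff.
BASE_RATE = 0.15          # $/kWh under the otherwise-applicable rate
CPP_ADDER = 0.90          # $/kWh added during event-day peak hours
NON_EVENT_DISCOUNT = 0.01 # $/kWh credited on peak hours of non-event days

def cpp_price(hour, is_event_day, peak_hours=range(14, 18)):
    """Return the hourly price: base rate, plus the adder on event-day peaks,
    minus the discount on non-event-day peaks."""
    if hour in peak_hours:
        return BASE_RATE + (CPP_ADDER if is_event_day else -NON_EVENT_DISCOUNT)
    return BASE_RATE

# Example: price at 3 PM on an event day vs. a non-event day.
print(cpp_price(15, True), cpp_price(15, False))
```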

34 The ex post evaluation used the best available method for commercial and industrial customers
Customer-specific regressions (within-subjects)
 For industrial customers (and commercial customers without a successful match), impacts were estimated using customer-specific regressions
 Electricity usage patterns on non-event days are used to estimate what customers would have done had an event not been called (a within-subjects method)
 The approach works for very large customers (where a valid control group may not be possible), for customers with low weather sensitivity, and when non-event day conditions are similar to event day conditions (often not the case)
 This approach was used for less weather-sensitive industrial customers and for commercial customers that could not be matched with a suitable control customer
Difference-in-differences panel regressions
 For commercial customers, the estimates rely on difference-in-differences panel regressions, which use an external control group and non-event day data
 The method observes how the control and participant groups behave during both event and non-event days
 It is less likely to be an artifact of the model selected, and it better captures behavior during event days without comparable weather conditions
 The approach works best when there are ample control group candidates, many observable variables, non-event or pre-enrollment data, and no small group of customers that dominates the load and/or reductions
 This approach was used for weather-sensitive commercial customers: institutional/governmental industries; offices, hotels, finance, and services; and retail stores

35 Individual Regressions – some 2013 events lacked comparable non-event days
 PG&E example using raw aggregated data for summer weekdays, without any modeling
 Some of the noise is explained by day of week, seasonal effects, and customer-specific weather
 Customer-specific regressions are generally better suited for industrial customers (a simplified sketch follows)
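A minimal sketch of such a customer-specific (within-subjects) regression, using synthetic data and an assumed specification (the evaluation's actual models are richer, but the idea is the same: non-event hours identify the counterfactual, and the event coefficient is the estimated kW impact):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic hourly data for one customer; columns and coefficients are illustrative.
rng = np.random.default_rng(0)
n_days, n = 120, 120 * 24
df = pd.DataFrame({
    "hour": np.tile(np.arange(24), n_days),
    "dow": np.repeat(np.arange(n_days) % 7, 24),
    "cdh": rng.uniform(0, 15, n),  # cooling degree hours as the weather term
    "event": 0,
})
# Flag a handful of weekday afternoon hours as event hours.
event_hours = (df["dow"] < 5) & df["hour"].between(14, 17) & (rng.random(n) < 0.05)
df.loc[event_hours, "event"] = 1
df["kw"] = 200 + 8 * df["cdh"] - 25 * df["event"] + rng.normal(0, 10, n)

# Within-subjects model: load as a function of weather and calendar terms;
# the coefficient on `event` is the estimated per-hour kW change during events.
fit = smf.ols("kw ~ event + cdh + C(hour) + C(dow)", data=df).fit()
print(fit.params["event"], fit.bse["event"])
```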

36 Difference-in-differences
 Difference-in-differences uses information from a control group and from hot non-event days
 Hourly loads for a well-matched control group nearly mirror the loads of the CPP population on event-like days
 These small differences are subtracted from the difference between control and CPP population loads on actual event days: the difference-in-differences (see the sketch below)
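A minimal version of that calculation, assuming a long-format DataFrame of hourly loads for the CPP and matched control groups on event and proxy days (the layout and column names are assumptions for illustration, not the evaluation's actual data model):

```python
import pandas as pd

def did_hourly_impact(loads: pd.DataFrame) -> pd.Series:
    """Difference-in-differences impact by hour.

    `loads` is assumed to have columns: group ('cpp' or 'control'),
    day_type ('event' or 'proxy'), hour, and kw (average kW per customer).
    """
    mean_kw = loads.groupby(["group", "day_type", "hour"])["kw"].mean().unstack("hour")
    # First differences: CPP minus control, on event days and on proxy days.
    event_gap = mean_kw.loc[("cpp", "event")] - mean_kw.loc[("control", "event")]
    proxy_gap = mean_kw.loc[("cpp", "proxy")] - mean_kw.loc[("control", "proxy")]
    # Second difference: netting out the small proxy-day mismatch leaves the
    # event-day load reduction (positive = kW reduced per customer).
    return proxy_gap - event_gap
```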

37 The non-event control days were selected to match event conditions as closely as possible
 We matched non-event days to historical events based on system loads, temperature, day of week, and program year (one possible approach is sketched below)
 Comparable proxy days are not available for some days with very extreme weather
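One simple way to implement that kind of matching, sketched with assumed daily summary columns (the actual selection also restricted candidates by day of week and program year, as the slide notes):

```python
import pandas as pd

def pick_proxy_days(event_days: pd.DataFrame, candidates: pd.DataFrame, n: int = 5) -> dict:
    """For each event day, keep the n non-event candidate days that are closest
    in daily system load and maximum temperature (column names are assumed)."""
    proxies = {}
    for _, ev in event_days.iterrows():
        distance = (
            (candidates["system_load_mwh"] - ev["system_load_mwh"]).abs().rank()
            + (candidates["max_temp_f"] - ev["max_temp_f"]).abs().rank()
        )
        proxies[ev["date"]] = candidates.assign(distance=distance).nsmallest(n, "distance")
    return proxies
```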

38 The validation tests show that the hybrid method outperforms the alternatives
 Impacts are estimated for the proxy event days using the same models and process used for the ex post evaluation; if a method is accurate, it produces impact estimates for the average proxy event that center on zero and are insignificant because, in fact, there is no event (sketched below)
 The estimated impacts when CPP event day prices were not in effect are near zero, and the reference loads estimated via the control group match the CPP participant loads
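That placebo-style check can be sketched as follows: apply the same impact estimator to proxy days on which no event was called and test whether the resulting "impacts" are statistically indistinguishable from zero (the estimator function here is a hypothetical placeholder):

```python
import numpy as np
from scipy import stats

def placebo_test(estimate_impact, proxy_days):
    """Run the impact estimator on non-event proxy days. If the method is
    unbiased, the estimated impacts should center on zero and be insignificant."""
    fake_impacts = np.array([estimate_impact(day) for day in proxy_days])
    t_stat, p_value = stats.ttest_1samp(fake_impacts, popmean=0.0)
    return fake_impacts.mean(), p_value
```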