Research, Evaluation, and Performance Measurement
Jackie Berger
Atlantic City Electric Energy Assistance Summit
August 25, 2016
APPRISE Background
Nonprofit Research Institute
- Founded in 2002
- Princeton, NJ
- Research conducted across the U.S.
Energy Program Research and Evaluation
- Low-Income Energy Bill Payment Assistance
- Low-Income Energy Efficiency
- Residential Energy Efficiency
- Commercial and Industrial Energy Efficiency
Our Clients
- Federal government
- State government offices
- Utility companies
- Nonprofits
NJ Low-Income Energy Programs
Federal Block Grant Programs
- Low-Income Home Energy Assistance Program (LIHEAP)
- Weatherization Assistance Program (WAP)
State Programs
- NJ Universal Service Fund Program (USF)
- NJ Comfort Partners
APPRISE NJ Experience
Energy Payment Assistance
- NJ SHARES Annual Evaluations
- NJ Universal Service Program, 2005
- NJ LIHEAP and USF, 2011
Low-Income Energy Efficiency
- NJ Comfort Partners, 2002
- NJ WAP, 2004
- NJ Comfort Partners Seniors Pilot, 2005
- NJ Comfort Partners, 2013
Other Energy Efficiency
- NJ Residential New Construction Baseline, 2001
- NJ Energy Star Homes, 2009
- NJ Clean Energy Economy, 2014
- NJNG SAVEGREEN, 2015
- SJG Energy Efficiency, 2016
Presentation Outline
Research
- What: Background Information
- Why: Understand Need and Context for Programs
- Example: NJ Needs Assessment
Evaluation
- What: Program Process and Impact
- Why: Document Impacts and Assess How to Improve the Program
- Example: BGE Pilot Payment Program
Performance Measurement
- What: Program Performance
- Why: Assess Opportunity for Improvement and Measure Improvement Over Time
- Example: NJ SHARES
Evaluation and Performance Measurement Comparison
Evaluation: Periodic, In-Depth, External
Performance Measurement: Ongoing, Developmental, Internal
Evaluation
- What are the program goals?
- How is my program performing compared to goals or expectations?
- How does it compare to other programs?
- How can the program improve?
Performance Measurement
- How can I measure?
  - My organization's efforts and inputs
  - Outcomes of those efforts
  - How we impacted clients
  - How we impacted the utility
- How has this changed over time?
- How does my program/organization compare?
  - What are higher performers doing?
  - Are those designs/actions related to results?
  - Can I implement those designs/actions?
Research
American Community Survey Data
- Represents NJ in 2014
- Number of low-income households under various definitions
- Household characteristics
- Energy bills
- Energy burden
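The charts listed below summarize eligibility and energy burden measures built from this ACS analysis. As a rough illustration of that kind of calculation, here is a minimal sketch assuming ACS-style household microdata; the column names and eligibility cutoffs are placeholders for illustration, not actual PUMS variables or NJ program rules.

```python
import pandas as pd

# Hypothetical ACS-style household microdata; column names and values are
# placeholders, not actual PUMS variables.
households = pd.DataFrame({
    "income":        [18_000, 32_000, 9_500, 54_000],  # annual household income ($)
    "electric_cost": [1_400, 1_100, 900, 1_600],        # annual electricity cost ($)
    "gas_cost":      [800, 0, 650, 700],                 # annual gas cost ($)
    "pct_poverty":   [70, 140, 45, 230],                 # income as % of federal poverty level
})

# Count eligible households under two illustrative income cutoffs
# (placeholders, not the actual LIHEAP/USF rules).
eligible_200 = (households["pct_poverty"] <= 200).sum()
eligible_175 = (households["pct_poverty"] <= 175).sum()

# Energy burden: annual energy bills as a share of annual income
households["energy_burden"] = (
    (households["electric_cost"] + households["gas_cost"]) / households["income"]
)

print(f"Eligible at <=200% FPL: {eligible_200}, at <=175% FPL: {eligible_175}")
print(households[["income", "energy_burden"]].round(2))
```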
Charts (from the American Community Survey analysis):
- NJ Program Eligibility
- Percent Eligible
- Number Eligible
- Main Heating Fuel
- Home Ownership
- Language Spoken
- Electric Bills: Non-Electric Heaters
- Electric Bills: Electric Heaters
- Energy Burden: Gas Heaters
- Energy Burden: Electric Heaters
Evaluation
Why Evaluate?
"Measurement is the first step that leads to control and eventually to improvement. If you can't measure something, you can't understand it. If you can't understand it, you can't control it. If you can't control it, you can't improve it." ― H. James Harrington
Why Evaluate
- Measure Program Impacts
- Assess Potential Improvements
- Meet Regulatory Requirements
BGE Pilot Motivation
Existing limited-income discount program
- Incentive for on-time bill payment
- But only 27% receive credit for timely bill payment
New pilots
- Attempt to cost-effectively increase on-time payment
- Test different programs and benefits
- Determine impacts on payments and usage
- Determine cost-effectiveness
Pilot Programs
CAMP
1. Double bill credit
2. Existing credits and payment counseling
3. Double bill credit and payment counseling
GRAD
1. Graduated credits
2. Graduated credits and Quick Home Energy Check-up
3. Graduated credits and payment counseling
CAMP Credits
Poverty Level      | Monthly CAMP Credit (Historical) | Monthly CAMP Credit (Pilot)
≤75%               | $12                              | $24
76% - 110%         | $9                               | $18
111% - 150%        | $7                               | $14
151% - 175%        | $5                               | $10
Subsidized Housing |                                  |
GRAD Credits
Monthly Usage (kWh) | Discount or Credit | Monthly Usage (Therms) | Discount or Credit
≤500                | 40%                | ≤40                    | 30%
                    |                    | 41-60                  |
751-1,000           | 20%                | 61-80                  |
1,001-1,500         | 10%                | 81-120                 |
>1,500              | $15 credit         | >120                   | $10 credit
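To make the graduated structure concrete, here is a minimal sketch of how a usage-tiered credit could be applied to a monthly bill. The tier boundaries and benefit levels are placeholders loosely based on the electric column above, not the actual GRAD schedule.

```python
# Minimal sketch of applying a usage-tiered ("graduated") bill credit.
# The tier boundaries and benefit levels are placeholders for illustration,
# not the actual GRAD schedule.

TIERS = [
    # (max monthly kWh, percent discount, flat credit $)
    (500,          0.40, 0.0),
    (1_000,        0.20, 0.0),
    (1_500,        0.10, 0.0),
    (float("inf"), 0.0,  15.0),   # highest-usage tier gets a flat credit
]

def monthly_credit(usage_kwh: float, bill_amount: float) -> float:
    """Dollar credit for one month's bill under the tiered schedule."""
    for max_kwh, pct, flat in TIERS:
        if usage_kwh <= max_kwh:
            return flat if pct == 0 else pct * bill_amount
    return 0.0

print(monthly_credit(900, 130.0))   # 20% of a $130 bill -> $26.00
```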
CAMP Pilot Credits
Treatment Group          | #   | Mean Number of Credits | Mean Total Credits | Mean Average Credit
All CAMP                 | 824 | 4.5                    | $51                | $11
1 – Double Credits       | 291 |                        | $59                | $13
2 – Payment Counseling   | 233 | 4.7                    | $32                | $7
3 – Credits & Counseling | 300 | 4.3                    | $58                |
GRAD Pilot Credits
Treatment Group           | #   | Mean Number of Credits | Mean Total Credits | Mean Average Credit
All GRAD                  | 822 | 8.5                    | $239               | $28
1 – Graduated Discount    | 304 | 8.7                    | $243               |
2 – Discount & Audit      | 261 | 8.2                    | $228               |
3 – Discount & Counseling | 257 |                        | $244               |
Bill Payment
Survey question: "Since you have been participating in this program, would you say you have been paying your BGE bill on time more often, paying your BGE bill on time less often, or has there been no change in when you pay your BGE bill?"
Responses across the CAMP and GRAD treatment groups (All, Double Credit, Payment Counseling, Credit & Counseling; Discount, Discount & Audit, Discount & Counseling):
More Often: 44%, 48%, 40%, 49%
Less Often: 6%, 7%, 4%, 8%, 5%
No Change: 50%, 45%, 56%, 46%, 47%
Program Participation
Survey question: "Has your participation in the program over the past year led to your participation in other energy programs, such as the Weatherization Assistance Program, BGE's Limited Income Energy Efficiency Program, or any other energy program?"
Responses across the CAMP and GRAD treatment groups (All, Double Credit, Payment Counseling, Credit & Counseling; Discount, Discount & Audit, Discount & Counseling):
WAP: 24%, 23%, 26%, 28%, 21%, 43%
LIEEP: 20%, 16%, 27%, 31%, 11%
Other Energy Program: 37%, 19%
CAMP Arrearage Impacts
Treatment Group          | #   | Pre  | Post | Gross Change | Net Change
All CAMP                 | 566 | $197 | $230 | $33**        | -$60**
1 – Double Credits       | 204 | $181 | $202 | $21          | -$72**
2 – Payment Counseling   | 160 | $187 | $216 | $29          | -$64
3 – Credits & Counseling | 202 | $220 | $268 | $48*         | -$45
GRAD Arrearage Impacts
Treatment Group           | #   | Pre  | Post | Gross Change | Net Change
All GRAD                  | 561 | $276 | $251 | -$26**       | -$119**
1 – Graduated Discount    | 213 | $257 | $254 | -$3          | -$96**
2 – Discount & Audit      | 170 | $332 | $289 | -$43*        | -$136**
3 – Discount & Counseling | 178 | $246 | $210 | -$36**       | -$129**
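For reference, the gross change is the treatment group's pre- to post-period change in arrearages, and the net change is typically that change measured relative to a comparison group over the same period; the deck does not spell out the estimator, so the sketch below shows the usual difference-in-differences style calculation with hypothetical account-level data.

```python
import pandas as pd

# Hypothetical account-level arrearage data; column names are assumptions.
df = pd.DataFrame({
    "group": ["treatment", "treatment", "comparison", "comparison"],
    "pre":   [250.0, 310.0, 240.0, 300.0],   # arrearage balance before the pilot ($)
    "post":  [210.0, 295.0, 270.0, 330.0],   # arrearage balance after the pilot year ($)
})

df["change"] = df["post"] - df["pre"]

gross = df.loc[df["group"] == "treatment", "change"].mean()   # treatment pre/post change
comparison = df.loc[df["group"] == "comparison", "change"].mean()
net = gross - comparison                                       # change relative to comparison group

print(f"Gross change: ${gross:,.0f}   Net change: ${net:,.0f}")
```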
Recommendations: Pilot Design
- Self-selection: those who responded to letters enrolled; difficult to extrapolate to all customers.
- Stratification: done differently for CAMP, GRAD, and the comparison group; difficult to estimate and compare results.
  - CAMP: number of on-time payments, poverty level
  - GRAD: number of on-time payments, electric usage, arrearages
- Enrollee tracking: difficult to examine data attrition issues.
Recommendations: Administration
Data access for DEF payment counselors
- Could respond to customer questions
- Would not have to refer customers to BGE
Payment timing
- Social Security recipients had trouble with payment timing
- Change the bill due date to align with benefit payments at the customer's request
Recommendations: Customer Education
- Customers were unaware of program elements
- Many communication opportunities: invitation letter, phone enrollment, confirmation letter
- Shorten and simplify written communication
Recommendations: Implementation
- Stratification: represent all customers to be targeted by full-scale implementation
- Customer targeting: target those likely to have a beneficial outcome
- Program potential: examine potential cost savings against potential costs
- Cost-effectiveness: structure payments to be no less than what was paid prior to the program
Performance Measurement
Types of Measures
Moving from inputs to impacts, data become more difficult to obtain but the information becomes more powerful.
- Inputs: Staff Hours, Equipment, Supplies
- Outputs: # Applied, # Enrolled, % Vulnerable, $ in Benefits, # Referred
- Outcomes: Bill Reduction, Burden Reduction
- Impacts: % Paid Bill, % Terminated
NJ SHARES Input Example
- 250 agencies deliver grants
- 300 sites where clients can apply
- 269 events to raise awareness
- Individual Contributions: $198,185
- Corporate Contributions: $461,361
- Fundraising: $1,993,948
NJ SHARES Output Example
NJ SHARES serves needy households
- Children under the age of six: 20%
- Single-parent households: 22%
- Annual income below $50,000: 58%
- Have a family member over 60: 22%
NJ SHARES serves the working poor
- 82% of households have employment income
- 6% of households receive unemployment benefits
- 10% of 2013 grantees received unemployment benefits
- 5% received unemployment benefits (pre-recession)
NJ SHARES provides grants to those in temporary need of assistance
- 77% received a grant in only one of the past 9 years
- Only 8% received a grant in more than two of the past 9 years
- In the 90 days before the grant, recipients averaged 2.2 payments and $434 in payments
NJ SHARES Grant Guidelines: Maximum Grant Amounts
Maximums by grant type, 2005 onward (raised over time):
Electric Only: $250, $300, $500
Gas Only: $700
Electric & Gas: $1,000, $1,200
Electric Heat:
Oil/Propane: --
Output Example: % Receiving Max Grant (chart; not updated)
Outcome Example: Grant Coverage by Grant Type
Q1 and Q2 2014 Recipients
                                          | Electric Only | Gas Only | Electric & Gas | Electric Heat
Number of Customers                       | 60            | 41       | 313            |
Mean Pre-Grant Balance                    | $739          | $878     | $1,421         | $1,440
Mean Grant                                | $429          | $632     | $929           | $645
Mean Post-Grant Balance                   | $310          | $246     | $491           | $795
Mean Percent of Pre-Grant Balance Covered | 78%           | 77%      | 85%            | 69%
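A minimal sketch of the coverage calculation in the last row above, assuming account-level grant records with hypothetical column names. Whether the reported figure is the mean of account-level percentages or the ratio of means is not stated, so this shows the account-level version.

```python
import pandas as pd

# Hypothetical grant records; column names are assumptions.
grants = pd.DataFrame({
    "pre_grant_balance": [820.0, 1_310.0, 640.0],
    "grant_amount":      [500.0, 1_000.0, 640.0],
})

grants["post_grant_balance"] = grants["pre_grant_balance"] - grants["grant_amount"]
grants["pct_covered"] = grants["grant_amount"] / grants["pre_grant_balance"]

print(f"Mean percent of pre-grant balance covered: {grants['pct_covered'].mean():.0%}")
```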
Maximum Grant Assessment
- 63% of electric-only 2014 grantees received the maximum amount, compared to 84% in both 2012 and 2013
- 78% of pre-grant balances were covered by electric-only grants, compared to 58% in 2012 and 70% in 2013
- The increase in the electric-only grant maximum from $300 to $500 was effective
- 73% of electric heat recipients received the maximum of $700
- Electric heat grants cover 69% of pre-grant balances, compared to 78% for electric-only grants, 77% for gas-only grants, and 85% for electric and gas grants
- Consider an increase in the maximum electric heat grant?
Impact Example: Segmentation Analysis (chart)
Successful: 38%, Marginal Success: 5%, Need More Help: 57%
Payment Compliance Analysis: Segmentation Analysis
Segment shares in the year after grant receipt, by cohort (Q1 2006 through Q1 2014, and Q1 & Q2 2014):
Successful: 26%, 24%, 19%, 32%, 49%, 29%, 39%, 38%
Marginal Success: 7%, 6%, 5%, 4%
Need More Help: 67%, 70%, 76%, 61%, 62%, 44%, 69%, 66%, 57%
TOTAL: 100%
Impact Example: Segmentation Analysis (chart)
Successful: 65%, Marginal Success: 8%, Need More Help: 27%
Impact Example: Segmentation Analysis
Segmentation by grant cohort, for the first and second years after grant receipt:
Successful: 49%, 50%, 26%, 53%, 29%, 67%, 39%, 38%
Marginal Success: 7%, 12%, 5%, 10%, 8%, 4%
Need More Help: 44%, 37%, 69%, 66%, 25%, 57%
Accounts Included: 1,429, 1,089, 672, 569, 497, 318, 152, 316
Impact Example: Segmentation Analysis
Q1 & Q2 2014 Recipients, "Need More Help" group by size of balance increase
                     | Balance Increased $100 - $399 | Balance Increased $400 - $999 | Balance Increased $1,000+
Number of Customers  | 67     | 75     | 37
Percent of Customers | 21%    | 24%    | 12%
Mean Charges         | $1,873 | $2,526 | $4,052
Mean Payments        | $1,627 | $1,865 | $2,213
Impact Example: Segmentation Analysis of Elderly Households
Q1 & Q2 2014 Recipients
                     | Elderly Only | Non-Elderly Only | Difference
Number of Customers  | 48           | 268              | --
Percent of Customers | 15%          | 85%              |
Pre-Grant Balance    | $1,225       | $1,284           | -$29
Grant Amount         | $732         | $815             | -$83*
Post-Grant Balance   | $523         | $469             | $54
Success              | 26 (54%)     | 94 (35%)         | 19%**
Marginal Success     | 0 (0%)       | 17 (6%)          | -6%*
Needs More Help      | 22 (46%)     | 157 (59%)        | -13%
** Statistically significant at the 95% level
* Statistically significant at the 90% level
Data Sources
Agency Records
- Most accessible
- Should be put in a database
- May not be needed if there is a good program database
Data
- Customers served
- Characteristics: income, poverty level, elderly, children
- Services provided
Public Use Data
- Available for free download
- Characterize the eligible population in the service territory
- Programming skills needed
Data
- Number eligible
- Geography
- Characteristics: income, poverty level, elderly, children, language
- Energy costs
Customer Survey
- Real-time feedback
- Requires staff time
- Document methodology
Data
- Customer characteristics
- Satisfaction
- Self-reported impacts
Program Database
- Program manager: state or utility
- Canned reports
- Queries
Data
- Customers served
- Characteristics: income, poverty level, elderly, children
- Services provided
Utility Data
- Difficult to obtain
- Easier for a utility-managed program
- Requires software and programming skills
Data
- Customer type: heating, water heating, baseload
- Energy usage
- Energy bills
- Customer payments
- Energy assistance
- Collections actions
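Billing and payment transactions like these support outcome measures such as the "% Paid Bill" listed under Types of Measures. Below is a minimal sketch, with assumed table layouts and column names, of computing the share of billed dollars paid per customer.

```python
import pandas as pd

# Hypothetical utility transaction extracts; table and column names are assumptions.
bills = pd.DataFrame({
    "account": [101, 101, 102, 102],
    "billed":  [120.0, 135.0, 90.0, 95.0],
})
payments = pd.DataFrame({
    "account": [101, 101, 102],
    "paid":    [120.0, 100.0, 90.0],
})

billed = bills.groupby("account")["billed"].sum()
paid = payments.groupby("account")["paid"].sum().reindex(billed.index, fill_value=0.0)

pct_paid = (paid / billed).rename("pct_of_bill_paid")   # share of billed dollars paid
print(pct_paid.round(2))
```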
Performance Measurement Process
1. Start with available data
2. Identify performance measures
3. Determine additional data sources
4. Collect additional data
5. Develop additional performance measures
Performance Measurement: Repeat
- Compare results over time
- Assess what is working
- Refine the program
Summary
Research, evaluation, and performance measurement serve important purposes:
- Research: understand the program and the population served
- Evaluation: assess what is working and why
- Performance Measurement: measure performance over time
Contact
Jackie Berger, Ph.D.
President and Co-Founder
APPRISE
32 Nassau Street, Suite 200
Princeton, NJ 08542