

Applying the Tailored Design Method in a Randomized Control Trial Experiment Survey
Benjamin Messer, Research Into Action, Portland, OR
PAPOR Annual Conference, San Francisco, CA, December 14-15, 2017

Disclaimer The information provided and views expressed in this presentation are those of Research Into Action and do not represent the California Time of Use Working Group.

Agenda
01 Background
02 Study Design
03 Response Rate Results
04 Nonresponse, Mode Effects, and Data Quality
05 Survey Costs
06 Conclusions

Background

RCT Surveys are Increasingly Used to Inform Policies; Tailored Design Method Provides Framework
RCTs are scientifically rigorous
Conducted with more frequency in policy areas
Costs, time/duration, and complexity remain challenges
Getting the policy right raises the stakes; need high confidence
Not a one-size-fits-all approach
TDM provides a framework for making decisions about each aspect of a survey

Study Design

California IOUs Tasked with RCT Electric Rate Study of Customers
CA Legislature and Public Utilities Commission directed the three investor-owned utilities (IOUs) to switch residential customers to Time-Of-Use (TOU) electricity rates
IOUs: PG&E, SCE, and SDG&E (cover ~3/4 of CA)
IOUs first had to conduct a pilot RCT study to determine if TOU rates work, and if the rates have harmful economic and/or health impacts on vulnerable segments of the population
Compare customers on the standard rate (Control) to customers on TOU rates (Treatment)
On TOU rates, electricity prices vary by time of day (e.g., lower in the morning and at night, higher in the afternoon and evening), whereas the standard rate is based only on how much electricity is used
IOUs internally surveyed customers for Pilot recruitment and provided bill credits for participation in the study
Over 55,000 total participants

Complex RCT Study Design
(Slide shows a table crossing climate region — Hot, Moderate, Cool — and customer segment — Non-CARE/FERA; CARE/FERA below 100% FPG; CARE/FERA 100 to 200% FPG; Seniors — with Control vs. Rate 1, Rate 2, and Rate 3 cells by IOU.)
PG&E & SCE: 4 rates, 9 segments, 3 regions; 30 total cells
SDG&E: 2 TOU rates, 4 segments, 2 regions; 12 total cells

Tailoring Survey Design to Study Design
Offered web, mail, and phone modes, sequentially
Designed modes to be as similar as possible
Used user-friendly graphical design for web/mail modes
Used client letterheads, logos, and other materials
Included different framing/messaging in each contact
Sent contacts and offered the survey in five languages
Addressed communications to the account holder, but asked for the person in the household who makes decisions about energy usage/bills
Randomized lists and randomly reversed the order of scales to minimize primacy/recency effects
Provided a bill credit incentive upon completion of the survey
Included a point of contact at the survey center and at client headquarters
Unable to make the web survey mobile-friendly
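The list randomization and scale reversal tactics above can be sketched as follows; the helper names and example items are hypothetical, a minimal illustration rather than the study's actual instrument code:

```python
import random

def randomize_items(items, rng=random):
    """Shuffle response options per respondent to spread primacy effects."""
    shuffled = list(items)
    rng.shuffle(shuffled)
    return shuffled

def maybe_reverse_scale(scale, rng=random):
    """Reverse an ordered scale for a random half of respondents to balance recency effects."""
    return list(reversed(scale)) if rng.random() < 0.5 else list(scale)

# Hypothetical question content
options = ["Heating", "Cooling", "Lighting", "Appliances"]
scale = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]
print(randomize_items(options))
print(maybe_reverse_scale(scale))
```

Because only the order changes, responses can be mapped back to a fixed coding frame after collection.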

Survey Characteristics
11 research topics and 32 research questions
Survey included 39 “core” questions with up to 93 items, and 14 “non-core” questions with up to 31 items
Core questions were included in all modes; non-core questions were web-only
Required about 25 minutes
(Slide shows a matrix of core, non-core, and IOU-specific questions by mode — web, mail, phone — and by IOU — PG&E, SCE, SDG&E.)

Survey Implementation
Contact sequences differed by whether the participant had an email address on file, combining letter invites, mailed booklets, reminders, two email invites (email group only), and a phone call
Fielded the survey October to mid-December 2016, during the 2016 election
Performed a pre-test with 25 respondents
All nonrespondents received one phone call; phone was then used to target low-response segments
Shout-out to WSU SESRC

Response Rate Results

Overall Response Rates are High
44,558 total respondents
*Used AAPOR RR1 calculation
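The footnoted AAPOR RR1 is the number of complete interviews divided by all eligible plus unknown-eligibility cases. A minimal sketch, where every disposition count other than the 44,558 completes is invented for illustration:

```python
def aapor_rr1(complete, partial, refusal, noncontact, other, unknown):
    """AAPOR Response Rate 1: completes over (completes + partials +
    refusals + non-contacts + others + unknown-eligibility cases)."""
    denominator = complete + partial + refusal + noncontact + other + unknown
    return complete / denominator

# 44,558 completes (from the slide); the remaining counts are hypothetical
rate = aapor_rr1(complete=44558, partial=1200, refusal=3000,
                 noncontact=8000, other=500, unknown=2000)
print(f"RR1 = {rate:.1%}")  # → RR1 = 75.2%
```

RR1 is the most conservative of the AAPOR rates, since partial interviews count in the denominator but not the numerator.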

Response Rates are Consistent between Control and Treatment Groups
Within customer segments, the largest difference in RRs between the Control and Rate groups is 5 percentage points
The average difference is 3 percentage points

Some Variation in Response Rates Across Customer Segments, but Sufficient Statistical Power
All cells have more than 200 respondents; most have more than 350
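A quick way to see why cells of 200+ respondents are workable: compute the worst-case 95% margin of error for an estimated proportion. This is a back-of-envelope sketch ignoring design effects and finite-population correction, not the study's actual power analysis:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for an estimated proportion; p=0.5 is the worst case."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (200, 350):
    print(f"n={n}: +/-{margin_of_error(n):.1%}")  # ±6.9% at n=200, ±5.2% at n=350
```

Larger cells shrink the margin only with the square root of n, so the gain from 200 to 350 respondents is modest.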

Nonresponse, Mode Effects, and Data Quality

Nonresponse Analysis
Used data from the IOUs’ participant recruitment survey to compare 2016 customer survey respondents vs. nonrespondents on:
Income: six categories
Language preference: English vs. non-English
Household size: continuous
Number of seniors in household: continuous
Conducted a logistic regression for each region/segment/rate group, using ‘response to survey’ as the dependent variable and a p≤.05 significance level
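One such per-group regression can be sketched on simulated data. The covariates mirror the four predictors above, but the sample, coefficients, and outcome below are invented for illustration; the study used the IOUs' actual recruitment-survey records:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000  # hypothetical size of one region/segment/rate group

income_cat = rng.integers(1, 7, n)    # six income categories
non_english = rng.integers(0, 2, n)   # 1 = non-English language preference
hh_size = rng.integers(1, 7, n)       # household size
n_seniors = rng.integers(0, 3, n)     # number of seniors in household

X = np.column_stack([income_cat, non_english, hh_size, n_seniors])

# Simulate 'responded to survey': higher income raises, non-English lowers, the odds
logit = -1.0 + 0.3 * income_cat - 0.5 * non_english
responded = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression().fit(X, responded)
print(dict(zip(["income", "non_english", "hh_size", "seniors"],
               model.coef_[0].round(2))))
```

Running one model per cell, as the study did, keeps the response-propensity estimates comparable within each region/segment/rate combination.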

Income, Language, and to a Lesser Extent, HH Size and Seniors in HH are Significant Response Predictors
Lower-income customers were less likely to respond for a majority of groups (PG&E: 23 of 30 groups; SCE: 15 of 30; SDG&E: 9 of 12)
Non-English speakers were less likely to respond for some groups (PG&E: 15 of 30; SCE: 10 of 30; SDG&E: 10 of 12)
Larger households were less likely to respond for a few groups (PG&E: 4 of 30; SCE: 6 of 30; SDG&E: 4 of 12)
Households with more seniors were less likely to respond for a few groups (PG&E: 5 of 30; SCE: 2 of 30; SDG&E: 0 of 12)
Nonresponse did not vary across control and treatment groups

Minimal Mode Effects
Web/mail respondents were more likely to skip questions than phone respondents
Phone respondents were more likely to select “Don’t know” than web/mail respondents
Did not find primacy or recency effects on Likert-scale or list-based questions
The similar-as-possible design across modes, randomization of lists, and random reversal of scale order worked

Data Quality Was High
About 2% of respondents were identified as ‘satisficers’:
“Straightlined” on questions with 8 or more items
“Selected all” on questions with mutually exclusive answers
Skipped more than 20% of items
Satisficing was more common among low-income and non-English speaking respondents
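Two of the satisficing rules above (straightlining on large grids, and skipping more than 20% of items) can be sketched as a simple flag; the data layout and respondent below are hypothetical, and the mutually-exclusive "selected all" check is omitted for brevity:

```python
def flag_satisficer(responses, skip_threshold=0.20, straightline_min=8):
    """Flag a likely satisficer: identical answers on a grid of 8+ items,
    or more than 20% of items skipped.
    `responses` maps question -> list of item answers (None = skipped)."""
    all_items = [a for answers in responses.values() for a in answers]
    skipped = sum(a is None for a in all_items)
    if all_items and skipped / len(all_items) > skip_threshold:
        return True
    for answers in responses.values():
        given = [a for a in answers if a is not None]
        if len(given) >= straightline_min and len(set(given)) == 1:
            return True  # straightlined a large grid
    return False

# Hypothetical respondent: identical answers on a 9-item grid
respondent = {"q1": [3] * 9, "q2": [1, None, 2]}
print(flag_satisficer(respondent))  # → True
```

Flags like this are usually reviewed rather than used to drop cases automatically, since a uniform grid answer can occasionally be genuine.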

Survey Costs

Costs are High but Worth It for the Clients
Total survey costs were ~$800,000, excluding incentives
$800K / 44,558 respondents = ~$18/respondent
The average incentive was a $52 bill credit per respondent; about 3% of respondents received $75 or $100
Including incentives, total cost per respondent = ~$68

Conclusions

TDM Is Useful for RCT Study Surveys
TDM does not provide a step-by-step guide to designing a survey; it is a framework for making design decisions that accounts for the multiple factors influencing survey response
We successfully used TDM to design a survey for a large-scale RCT: high response rates, good data quality, and a fairly representative sample, at a cost acceptable to the clients
The data were useful for making policy decisions: the potential for economic and/or health hardship for CARE/FERA customers led the CPUC to exclude them from defaulting onto TOU rates
Plan ahead (there are lots of decisions to make), try to be flexible, and closely monitor implementation

Questions?

Benjamin Messer, Ph.D. Benjamin.Messer@researchintoaction.com

Appendices