Unobserved common causes of measurement and nonresponse error on the 2008 ANES Panel Survey. International Total Survey Error Workshop, Stowe, VT, June 2010.


Unobserved common causes of measurement and nonresponse error on the 2008 ANES Panel Survey
International Total Survey Error Workshop, Stowe, VT, June 13-16, 2010
Caroline Roberts – University of Lausanne, CH
Patrick Sturgis – University of Southampton, UK
Nick Allum – University of Essex, UK

Overview
- Background and motivation
- Objectives of this study
- Data and sample
- Methods: level of effort & response propensity analysis; Structural Equation Modeling
- Provisional conclusions and discussion points

Background
- Several theoretical models specify ways in which measurement error and nonresponse bias might be related (see Olson 2007)
- 'Common cause' model: variables influencing response propensity also influence response accuracy
- Opportunities to test the model have been restricted by data availability
- A focus on individual items does not address error arising from suboptimal response strategies

Motivation
- To what extent do common causes influence both types of error? The role of motivation and ability
- Our approach uses:
  - Panel data, to investigate a range of candidate common causes
  - Structural Equation Models, to quantify the unobserved component
- Of theoretical and practical interest

Objectives
Two elements:
1. Comparison of data quality based on respondent 'cooperativeness' in the panel: do the least cooperative differ from the most?
2. Analysis of common causes of response propensity and measurement error using SEM: what is the extent and magnitude of the unobserved component?

Data
- ANES Internet Panel Survey, recruited by RDD telephone survey
- Non-internet households were given MSN Web-TV
- 21 monthly Internet surveys, $10 each
- Fieldwork by Knowledge Networks
- Advance-release data file (June 2009) includes recruitment data (including CATI paradata), the core profile survey, plus 6 ANES waves (Jan, Feb, Jun, Sep, Oct & Nov 2008)
- DeBell, Krosnick, Lupia & Roberts, 2009

Sample and Fieldwork
- Probability sample of US citizens aged 18+
- Data from 1 of 2 recruitment cohorts
- 12,809 landline numbers; 2,371 completed recruitment (18.5%)
- 4-month fieldwork, up to 50 call attempts
- 2 protocol changes: refusal conversion by NORC; internet-only recruitment for cases with 50+ calls
- AAPOR1 = 26%; AAPOR3 = 42%
- 1,738 completed recruitment plus at least 1 ANES wave
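The two response rates on this slide reflect different AAPOR formulas: AAPOR RR1 treats every case of unknown eligibility as eligible, while RR3 counts only an estimated fraction e of them. A minimal sketch of the difference, using hypothetical disposition counts (not the actual ANES case dispositions):

```python
# Hedged sketch of the AAPOR response-rate formulas (hypothetical
# disposition counts, not the actual ANES case dispositions).
# I = complete interviews, P = partial interviews, R = refusals,
# NC = non-contacts, O = other eligible non-interviews,
# UNK = cases of unknown eligibility, e = estimated eligible share of UNK.

def rr1(I, P, R, NC, O, UNK):
    # RR1: every unknown-eligibility case counted as eligible.
    return I / (I + P + R + NC + O + UNK)

def rr3(I, P, R, NC, O, UNK, e):
    # RR3: only the estimated eligible fraction of unknowns counted.
    return I / (I + P + R + NC + O + e * UNK)

# Hypothetical dispositions summing to 12,809 sampled numbers:
I, P, R, NC, O, UNK = 2371, 0, 3000, 2500, 100, 4838
print(round(rr1(I, P, R, NC, O, UNK), 3))      # all unknowns in denominator
print(round(rr3(I, P, R, NC, O, UNK, 0.5), 3)) # e = 0.5 shrinks denominator
```

Because e ≤ 1, RR3 is always at least as large as RR1 on the same dispositions, which is why the two figures on the slide can differ so much.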

Panel retention

                              Number   % of total
Completed interviews           2,371
  Standard telephone           2,…
  Refusal conversion              85      3.6
  Internet                        64      2.7
Completed 1+ ANES waves        1,738
  Profile                               (88.9)
  Wave 1 (Jan 08)                       (90.7)
  Wave 2 (Feb 08)                       (82.7)
  Wave 6 (Jun 08)                       (80.9)
  Wave 9 (Sep 08)                       (84.3)
  Wave 10 (Oct 08)                      (85.6)
  Wave 11 (Nov 08)                      (85.3)

[Wave-level counts and some totals were not preserved in the transcript.]

Cooperativeness
3 indicators of recruitment effort:
1. Number of calls to a complete interview (1-5 vs. 6 or more)
2. Whether the respondent or a household member refused to participate during call attempts (refused once or more vs. never refused)
3. Whether the respondent was recruited after a protocol change (by internet or refusal conversion vs. by standard telephone)
Plus actual response propensity.
Compared: differences in sample composition, responsiveness, key survey estimates, and data quality.
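The three indicators above can each be expressed as a binary flag per recruited case. A minimal sketch, where the field names (n_calls, ever_refused, recruit_mode) are hypothetical, not the actual ANES recruitment-file variables:

```python
# Sketch: deriving the three binary recruitment-effort indicators.
# Field names are hypothetical, not the actual ANES variable names.

def effort_flags(case):
    return {
        # Indicator 1: 6 or more call attempts to a complete interview
        "high_call_count": case["n_calls"] >= 6,
        # Indicator 2: respondent or household member refused at least once
        "ever_refused": case["ever_refused"],
        # Indicator 3: recruited after a protocol change
        "protocol_change": case["recruit_mode"] in ("internet", "refusal_conversion"),
    }

case = {"n_calls": 8, "ever_refused": False, "recruit_mode": "standard_telephone"}
print(effort_flags(case))
```

Each flag then defines one of the group comparisons reported in the tables that follow (1-5 vs. 6+ calls; refusal vs. no refusal; standard vs. protocol-change recruits).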

Data quality
Indicators of survey satisficing (Krosnick, 1991):
- Item non-response (wave 1 only)
- Non-differentiation across items with the same response scale
- Preference for midpoints in branched questions
Item sets repeated across several ANES waves: condition of the country (5-point scale); candidate liking, attitudes to groups, policy attitudes, candidate policy positions (branched 7-point scales)
Validity checks: consistency and accuracy of reports, e.g. voting (but see Berent et al. 2010)
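Two of these indicators are straightforward to compute from a battery of items sharing one scale. A sketch of the general idea; the exact ANES operationalisation may differ:

```python
# Sketch of two satisficing indicators for a battery of items sharing a
# 7-point scale. Illustrative only; the ANES coding may differ.
from statistics import pstdev

def nondifferentiation(responses):
    # 1 = identical answers to every item, 0 = maximally spread answers.
    # Simple version: 1 minus the normalised standard deviation.
    max_sd = 3.0  # pstdev of half 1s / half 7s on a 7-point scale
    return 1 - min(pstdev(responses) / max_sd, 1)

def midpoint_share(responses, midpoint=4):
    # Proportion of items answered at the scale midpoint.
    return sum(r == midpoint for r in responses) / len(responses)

battery = [4, 4, 4, 5, 4, 4]  # one respondent's answers to a 6-item set
print(round(nondifferentiation(battery), 2))
print(round(midpoint_share(battery), 2))
```

Averaging these per-respondent scores within an item set gives wave-level means like the MP and ND figures reported later in the deck.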

Results
- Few significant differences in the refusal and protocol-change comparisons
- But respondents recruited after 6+ calls are:
  - younger and more likely to be Black, non-Hispanic
  - less likely to have Internet access
  - less likely to be Republican and conservative
  - slightly more likely to satisfice
- And reluctance at recruitment leads to lower cooperation in the panel

Responsiveness (mean or %, with SE in parentheses)

                              1-5 calls     6+ calls       No refusal    Refusal        Standard       Protocol change
                              (n=1222)      (n=516)        (n=1401)      (n=337)        (n=1641)       (n=97)
No. of waves started (mean)   6.34 (.04)    5.85*** (.07)  6.25 (.03)    5.95*** (.08)  6.26 (.03)     5.11*** (.14)
No. of complete waves (mean)  6.14 (.04)    5.61*** (.07)  6.05 (.04)    5.73** (.09)   6.05 (.04)     4.82*** (.15)
Completed all waves %         62.5 (1.38)   44.2*** (2.19) 59.5 (1.31)   47.2*** (2.72) 60.3 (1.20)    3.1*** (1.77)
Missed 1 or more waves %      23.4 (1.21)   39.5*** (2.15) 26.1 (1.17)   36.8*** (2.63) 25.2 (1.07)    79.4*** (4.13)
Dropped out %                 14.1 (1.00)   16.3 (1.63)    14.4 (0.94)   16.0 (2.00)    14.6 (0.87)    17.5 (3.88)

Sociodemographics, by number of panel surveys completed (three groups; column labels partly garbled in the transcript, the last group being 6-7 surveys completed)

Male %         46.6*     …        …
White, nH %    76.5***   77.4***  87.6
Black, nH %    13.5***   13.4**   7.0
Hispanic %      7.4**     7.2*    4.1
Other, nH %     6.5**     5.1     3.6

[Several rows, apparently age groups, lost their labels in the transcript; the surviving values are 33.8***/34.6***, 39.9**/39.7*, and 26.4*/25.7*/31.4.]

Sociodemographics (continued), by number of panel surveys completed

< High Sch %    7.4**    6.5*    3.8
High Sch %     19.2**   19.5*   13.8
College %        …        …       …
Bachelor %       …        …       …
Graduate %     14.4**   12.3**  20.4
Northeast        …        …       …
Midwest        23.7*    28.4      …
South            …        …       …
West           19.0*    17.8*   24.1

[Some values were not preserved in the transcript.]

Recruitment variables, by number of panel surveys completed

Internet access %               83.2**   83.9*   88.0
Interest in politics % ★          …      37.9†     …
Interest in computers ★           …        …       …
Number of opinions ★★             …        …       …
Important to obey authority ★     …        …       …

★ Very or extremely; ★★ About many things or just about everything
[Most values were not preserved in the transcript.]

Wave 1 variables, by number of panel surveys completed

Democrat          28.3*      …        …
Independent       22.9***   24.0**   31.9
Republican        17.4***   19.9***  33.6
Liberal           22.7***   24.0**   32.7
Moderate            …         …        …
Conservative      27.2***   29.8***  45.0
Interest in pol   33.6***   38.4***  55.7
Voted in …          …       77.0***  88.4
Voted Obama       28.1***   34.9**   43.5

[Some values and part of one row label were not preserved in the transcript.]

W1 attitudes [chart not preserved in the transcript]

Data quality, by number of panel surveys completed (MP = midpoint preference; ND = non-differentiation)

W1 MP (mean)    .47*    .48*    .45
W1 ND (mean)    .54*    .53      …
W2 MP (mean)    .60*    .61*    .57
W2 ND (mean)    .64*    .61      …
W6 MP (mean)    .34**   .33*    .30
W6 ND (mean)    .46*    .46†    .44
W9 MP (mean)    .32*    .32**   .28
W9 ND (mean)    .47*     …       …
W10 MP (mean)    …       …       …
W10 ND (mean)    …       …       …
W11 MP (mean)   .26†     …       …
W11 ND (mean)   .49*    .49     .47

[Some values were not preserved in the transcript.]

Summary
Level of effort analysis:
- Small differences between respondents as a function of the 'effort' required to recruit them
- Significant differences in their cooperativeness at later panel waves
- Significant differences in demographics, key survey estimates, and satisficing between more and less cooperative panel recruits
- A few differences on the substantive items used in the satisficing indicators, but not many

Common causes
Ability:
- Education
- Computer/Internet literacy (non-internet respondents required the MSN Web-TV device)
Motivation:
- Recruitment difficulty
- Interest in computers
- Interest in politics
Demographic characteristics:
- sex, age, race & ethnicity

Satisficing [SEM path diagram]: Recruitment difficulty (number of refusals, number of calls to complete), Ability (education, web access), Motivation (interest in politics, interest in computers), and Demographics (sex, age, race) predict both Response (number of panel waves started) and Satisficing (non-differentiation, use of midpoints); a correlated residual between the two outcomes tests for a remaining common cause.
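The "common cause? / correlated residual" question in the diagram can be illustrated outside a full SEM: regress both outcomes on the observed causes, then check whether their residuals still correlate. A sketch on simulated data (all variables and coefficients are invented for illustration; a real analysis would fit the SEM with estimators suited to ordinal/count outcomes):

```python
# Sketch of the correlated-residual idea: if an unobserved common cause u
# drives both response propensity and satisficing, their residuals remain
# correlated after partialling out the observed covariates. Simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 4))   # observed causes: ability, motivation, etc.
u = rng.normal(size=n)        # unobserved common cause

# Both outcomes depend on the observed causes AND on u.
response = X @ [0.5, 0.2, 0.1, 0.3] + 0.6 * u + rng.normal(size=n)
satisfice = X @ [-0.3, -0.4, 0.0, 0.1] + 0.6 * u + rng.normal(size=n)

def residuals(y, X):
    # OLS residuals: the part of y not explained by observed covariates.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

r = np.corrcoef(residuals(response, X), residuals(satisfice, X))[0, 1]
print(round(r, 2))  # clearly positive: the shared unobserved cause u remains
```

If the observed covariates fully accounted for the association, this residual correlation would be near zero, which is essentially the test the SEM's correlated-residual parameter performs.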

SEM estimates
Successive models add covariate blocks: no covariates; +Recruitment; +Ability; +Motivation; +Demographics. Cell pairs are (Resp, Satis) = effects on response propensity and on satisficing.

Refusals               Resp -.11*; remaining cells garbled; final Satis .17
Calls                  Resp -.04***; remaining cells garbled; final Satis .023
Postgrad degree        (.44**, -3.0**)   (.37**, -1.90**)   (.26*, -1.66)
College degree         (.42**, -2.4**)   (.35**, -1.60**)   (.22*, -.47)
Some college           (.34**, -1.8**)   (.30**, …)         (…, -1.40**)
Interest in politics   (.37**, -7.78**)  …
Interest in computers  …
Female                 (-.13*, -1.05**)
Age                    (.01**, -.01)
White                  (.40**, -1.82**)
Web TV                 (-.29**, .50)
Residual correlation   -.09**   -.08**
RMSEA                  .95

[Many cells were not preserved in the transcript.]

Summary 2
SEM:
- Very weak correlation between satisficing and propensity to respond: a 'reassuring' result?
- Recruitment difficulty predicts response propensity but not satisficing
- Motivation variables better predict satisficing; ability better predicts response. Together they can jointly account for the weak correlation between propensity to respond and satisficing.

Discussion points
Limitations:
- absence of external records
- advance-release data
- specification of the SEM
Can we improve measures of responsiveness and satisficing (including the choice of item sets)?
How can we best utilize the strengths and compensate for the limitations of the panel design?

Thank you

Sample composition

Characteristic   Percent in sample analyzed (n = 1,738)   Percent in population (CPS, March 2008)
Male                        …                                        …
Northeast                   …                                        …
Midwest                     …                                        …
South                       …                                        …
West                        …                                        …

[Age categories (including an "… or older" row) and all values were not preserved in the transcript.]

Sample composition (continued)

Characteristic   Percent in sample analyzed (n = 1,738)   Percent in population (CPS, March 2008)
Race/ethnicity: White non-Hispanic; Black non-Hispanic; Hispanic; Other non-Hispanic
Educational attainment: < High school diploma; High school diploma; Some college; Bachelor's degree; Graduate degree

[Values were not preserved in the transcript.]

Sociodemographics, by recruitment-effort group (1-5 calls n=1222; 6+ calls n=516; no refusal n=1401; refusal n=337; standard recruit n=1641; protocol change n=97). Rows: Male %; age groups; White nH %; Black nH %; Hispanic %; Other nH %. [Values were not preserved in the transcript; only significance markers survive.]

Sociodemographics (continued), by recruitment-effort group. Rows: < High Sch %; High Sch %; College %; Bachelor %; Graduate %; Northeast; Midwest; South; West. [Values were not preserved in the transcript; only significance markers survive.]

Recruitment variables, by recruitment-effort group. Rows: Internet access %; Interest in politics % ★; Interest in computers ★; Number of opinions ★★; Important to obey authority ★. ★ Very or extremely; ★★ About many things or just about everything. [Values were not preserved in the transcript.]

Wave 1 variables (n=1577), by recruitment-effort group. Rows: Democrat; Independent; Republican; Liberal; Moderate; Conservative; Interest in politics; Planned to vote; Voted in …; Voted Obama. [Values were not preserved in the transcript; only significance markers survive.]

Data quality, by recruitment-effort group (INR = item non-response; MP = midpoint preference; ND = non-differentiation). Rows: W1 INR; W1 MP; W1 ND; W2 MP; W2 ND; W6 MP; W6 ND; W9 MP; W9 ND; W10 MP; W10 ND (means). [Values were not preserved in the transcript; the surviving entries are .47* (W1 MP) and .47** (W6 ND).]