
Overview of 2010 EHC-CAPI Field Test and Objectives Jason Fields Housing and Household Economic Statistics Division US Census Bureau Presentation to the ASA/SRM SIPP Working Group November 17, 2009

“Re-SIPP” Development
- Following successful completion of the EHC Paper Field Test
- Develop the 2010 plan to test an electronic EHC instrument
- Broad involvement across the Census Bureau: DID, FLD, TMO, DSD, HHES, DSMD, SRD

Primary Goals of 2010 Test
(1) Strong evidence of comparable data quality
  - How well do the calendar year 2009 data from the 2010 EHC-CAPI Field Test match data from the 2008 SIPP panel?
  - Especially for income transfer programs
(2) Strong evidence to guide development and refinement before implementation in 2013 as the production SIPP instrument

Basic Design Features (1)
- 8,000 Sample Addresses
  - could have been larger!
  - enough sample and budget to support research and field activities
- “High Poverty” Sample Stratum
  - to evaluate how well income transfer program data are collected (see the sketch below)
- State-Based Design
  - likely (possible?) access to admin records
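The high-poverty stratum and state-based design together imply differential sampling rates across strata. As a rough illustration only, here is a minimal sketch of drawing such a sample, assuming an invented address frame and invented rates (the deck specifies neither):

```python
import random

random.seed(2010)

# Hypothetical address frame: (address_id, state, high_poverty_flag).
# The real frame, strata definitions, and rates are not given in the deck.
frame = [(i, random.choice(["NY", "TX", "CA"]), random.random() < 0.2)
         for i in range(100_000)]

BASE_RATE = 0.01     # assumed sampling rate, general stratum
POVERTY_RATE = 0.03  # assumed higher rate, high-poverty stratum

sample = [addr for addr in frame
          if random.random() < (POVERTY_RATE if addr[2] else BASE_RATE)]

print(f"sampled {len(sample)} of {len(frame)} addresses")
print(f"high-poverty share of sample: {sum(a[2] for a in sample) / len(sample):.2f}")
```

Oversampling the high-poverty stratum buys precision for the transfer-program estimates the test most needs, at the cost of differential weighting later.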

RO     State(s)                                             Sample N    Notes
BOS    Connecticut, Massachusetts, New York, Rhode Island   ,155        covers upstate (non-NYC) NY
NY     New York                                             1,681       covers NYC portion of NY
PHIL   Maryland                                             280
CHI    Illinois, Wisconsin                                              excludes 57 IL addresses in KC-RO
DAL    Texas, Louisiana                                     1, ,707
LA     California                                           2,407       excludes 445 CA addresses in SEA-RO

TOTAL N: 7,982
TOTAL ADMIN RECS (?) N: 6,736

Basic Design Features (2)
- Field Period: Early Jan - mid March 2010
  - collect data about calendar year 2009
- Field Representative training in Dec/Jan
  - goal: minimize # of FRs with post-training “down-time”
  - evaluation and improvement of training
- Use FRs with a wide range of experience
- Expand RO involvement

Research Agenda
1. Quantify likely cost savings
2. Test the data processing system
3. Evaluate data quality
4. Evaluate “field support” materials
5. Evaluate FR training
6. Identify & document instrument “bugs”
7. Identify “interview process” issues
8. Identify usability issues (esp. EHC)
HOW CAN WE IMPROVE FOR 2013?

Special Methods 1. Quantify likely cost savings
- new cost code(s) established
- timing of interview length
- trade-off between 12-month recall and 3 interviews per year (see the sketch below)
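The core trade-off named above is one annual 12-month-recall interview versus the legacy design's three interviews per year, which is why timing data and new cost codes are being collected. A back-of-the-envelope sketch with placeholder figures (not Census Bureau numbers):

```python
# All dollar figures are placeholders; a real comparison would also let the
# (longer) EHC interview carry a higher per-case cost, informed by the
# interview timing data collected in the test.
COST_PER_INTERVIEW = 150.00  # assumed average field cost per completed case
HOUSEHOLDS = 8_000           # sample size from the field test design

legacy_annual_cost = HOUSEHOLDS * 3 * COST_PER_INTERVIEW  # 3 waves per year
ehc_annual_cost = HOUSEHOLDS * 1 * COST_PER_INTERVIEW     # 1 EHC wave per year

print(f"legacy: ${legacy_annual_cost:,.0f}  EHC: ${ehc_annual_cost:,.0f}")
print(f"illustrative annual field savings: ${legacy_annual_cost - ehc_annual_cost:,.0f}")
```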

Special Methods 2. Test the data processing system
- The data collected in this test will be used to develop and test a new data processing system.

Special Methods 3. Evaluate data quality
- administrative records
- recording of selected interviews
- extract SIPP 2008 panel data; compare CY2009 estimates from the two surveys (see the sketch below)
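A hedged sketch of the comparison step: compute a weighted CY2009 estimate (here, a program-participation rate) from each source and difference them. The data layout and names are invented; a real SIPP comparison would use the surveys' replicate weights and proper variance estimation.

```python
def weighted_rate(records):
    """Weighted participation rate from (weight, participated) pairs."""
    total_weight = sum(w for w, _ in records)
    return sum(w * y for w, y in records) / total_weight

# Toy stand-ins for CY2009 extracts from the two sources.
ehc_test = [(1.0, 1), (2.0, 0), (1.5, 1)]   # 2010 EHC-CAPI field test
sipp_2008 = [(1.2, 1), (1.8, 0), (1.0, 0)]  # 2008 SIPP panel extract

diff = weighted_rate(ehc_test) - weighted_rate(sipp_2008)
print(f"EHC-CAPI minus SIPP 2008 participation rate: {diff:+.3f}")
```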

(Details) Interview Recording
- close-to-RO FRs (approximately 80)
- 3 recording windows (early Jan, late Jan, mid Feb)
- message: “record the next two interviews”
- approximately 480 recorded interviews
- with consent; adults only (21+)
- record R’s entire continuous “turn”
- in RO, with the assistance of the ROCS, transfer recordings to the secure HQ network
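The recording target follows directly from the protocol's own numbers: roughly 80 close-to-RO FRs, three recording windows, two interviews per window.

```python
FRS = 80                   # approximate number of close-to-RO FRs
WINDOWS = 3                # early Jan, late Jan, mid Feb
INTERVIEWS_PER_WINDOW = 2  # "record the next two interviews"

print(FRS * WINDOWS * INTERVIEWS_PER_WINDOW)  # 480, the approximate target
```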

Special Methods 4. Evaluate “field support” materials (advance letter, brochure, calendar aid)
- Respondent debriefing instrument block
- FR debriefing sessions
- recording of selected interviews

(Details) R Debriefing Block
- at end of interview (status=“complete”)
- focus on “field support” materials:
  - advance letter, brochure, calendar aid
- very brief question set (sketched below):
  - “did you see [X]?”
  - “did you read [X]?”
  - “did [X] have [+/-/0] impact?”
- with most convenient respondent
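The debriefing block is essentially a short, fixed question set crossed with the three materials. A minimal data-driven sketch of that structure (the production instrument is a Blaise block; this Python rendering and its names are illustrative only):

```python
MATERIALS = ["advance letter", "brochure", "calendar aid"]
QUESTIONS = [
    "Did you see the {x}?",
    "Did you read the {x}?",
    "Did the {x} have a positive, negative, or no impact?",
]

def debriefing_block(ask):
    """Ask every debriefing question about every material; return answers."""
    answers = {}
    for material in MATERIALS:
        for template in QUESTIONS:
            answers[(material, template)] = ask(template.format(x=material))
    return answers

# Stand-in for the interviewer/respondent exchange:
print(debriefing_block(lambda question: "yes"))
```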

(Details) FR Debriefings
- at (or near) end of field period
- at least one session per RO
- with 8-10 FRs/SFRs
- guided 2-3 hour discussion
- wide range of issues – e.g., training, EHC procedures, usability, interview “process” issues, etc.
- improvements for 2013

Special Methods 5. Evaluate FR training
- recording of selected interviews
- certification (and other) testing
- HQ (and RO) training observation
- HQ (and RO) interview observation
- FR debriefing sessions
- FR feedback instrument block
- FR training assessment form
- Trainers’ debriefing

(Details) HQ/RO Interview Observations
- intensive HQ/RO observation of field test
- key observation themes:
  - use of EHC techniques (landmarks, cross-domain referencing, calendar aid)
  - instrument usability/navigation
  - FR preparedness/training
  - R interest/engagement
- R debriefing regarding landmarks

(Details) FR Feedback Block
- at end of interview (status=“complete”)
- brief set of Qs about:
  - use of EHC methods (domains; success)
  - EHC instrument bugs
  - perceived +/- R reactions
  - training gaps

Special Methods 6. Identify & document instrument “bugs”
- HQ (and RO) interview observations
- FR debriefing sessions
- FR feedback instrument block
- item-level notes

(Details) Item-Level Notes
- accessible throughout Blaise interview
  - non-calendar sections’ standardized Q “script”
- FR training will encourage & instruct
- focus on “bugs” – instrument not working as planned (see the tagging sketch after this list), e.g.:
  - wrong/missing fills
  - garbled wording
  - wrong/missing Qs
  - FR “work-arounds”
  - missing help screens
  - confusing/inapp./redundant/etc. Qs
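Because the notes are free text attached to specific items, turning them into 2013 fixes amounts to binning each note into the bug categories above. A naive keyword-tagging sketch (categories from this slide; keywords and example text invented):

```python
BUG_CATEGORIES = {
    "fills": ["fill"],
    "wording": ["garbled", "wording"],
    "routing": ["wrong question", "missing question", "skipped"],
    "workaround": ["work-around", "workaround"],
    "help": ["help screen"],
}

def tag_note(note):
    """Return every bug category whose keywords appear in the note text."""
    text = note.lower()
    hits = [cat for cat, keywords in BUG_CATEGORIES.items()
            if any(k in text for k in keywords)]
    return hits or ["other"]

print(tag_note("Fill showed [NAME] instead of the respondent's name"))  # ['fills']
```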

Special Methods 7. Identify “interview process” issues (interview “flow,” R interest/engagement, EHC interaction, mix of structured/unstructured Qs)
- HQ (and RO) interview observations
- FR debriefing sessions
- FR feedback instrument block
- recording of selected interviews

Special Methods 8. Identify usability issues (esp. EHC) (instrument navigation, FRs’ ability to access and use special features of the EHC)
- HQ (and RO) interview observations
- FR debriefing sessions
- FR feedback instrument block
- recording of selected interviews
- FR testing sessions at HQ

Summary: Research Agenda
1. Quantify likely cost savings
2. Test the data processing system
3. Evaluate data quality
4. Evaluate “field support” materials
5. Evaluate FR training
6. Identify & document instrument “bugs”
7. Identify “interview process” issues
8. Identify usability issues (esp. EHC)

Summary: Research Agenda
Lots of Extra “Stuff” – the 2010 Test is Loaded
- Data quality
- Instrument quality
- Training quality
GOAL: Fully Exploit the Test’s Information Potential
- Improvements/Refinements for 2013

What’s Missing from 2010?
- Attrition/mover effects in an annual interview
- Year-to-year data quality - seams between waves of a 12-month reference period interview
- Wave 2+ instrument and procedures
- In Development – 2011 / 2012 Testing Plans

Thanks! Questions? contact: