Data Editing in a Common Internet Data Collection System
Work Session on Statistical Data Editing, 21-23 April 2008
Topic v: Editing of data acquired through electronic data collection
WP 3
Overview
- Functional requirements for an enterprise-wide IDC
- Vision
- Three-part focus
- Three functional areas and related principles
- Interviews and literature search
- High-level recommendations
- Detailed functional requirements on "front end" editing
- ISMS: authoring tool, rendered form, and reporting
- Current ISMS status and conclusion
Background & Vision
- Multiple IDCs at EIA
- Strategic Plan initiative for a common Internet-based data collection system
- An interoffice team was formed to get an agency-wide perspective and buy-in
- Vision: a versatile, flexible, integrated collection framework inclusive of authoring tools
Three-Part Focus
The work focused on three parts:
1) The development of IDC "principles"
2) Research to determine the best practices and approaches relevant to these EIA principles
3) The detailed key requirements for a successful application at EIA (what is required, not how to satisfy those requirements)
Principles & Dimensions
- The IDC team's work first centered on the development of IDC "principles"
- Principles are generalizations that were accepted and used as the basis for reasoning
- These principles represented three views: the respondent's, the survey program's, and the EIA corporate view
- Important characteristics were then agreed upon from each of these viewpoints
Respondent Point-of-View
- Ease of access and use; 24-7 access
- Ready, useful help and prompt support
- Ability to easily report large amounts of data
- Reduced burden compared to other modes
- Confidentiality protection
- Ability to interrupt and complete surveys later; to save, print, and forward; and to finalize and transmit information in both directions
- Other incentives for respondents ("What's in it for me?" / "first results")
Survey Program Point-of-View
- Reductions in cycle time and in the costs associated with data entry, mailing, and editing follow-up
- A specific survey instrument application that makes use of shared infrastructure
- Instrument design flexibility
- Ability for the program office to directly add, delete, or modify surveys
- Flexibility and ease in changing survey definitions, instructions, forms, respondent sets, and other metadata directly by program offices
Survey Program Point-of-View (continued)
- Reduced reporting errors through editing at data capture; ease and flexibility in adding, modifying, and resolving edits and in monitoring edit performance
- Ability to process resubmissions/revisions
- Respondent case and survey management
- Ability to measure process performance in real time
- Ease of integration and synchronization with other reporting modes and systems
EIA Corporate Point-of-View
- A centralized platform
- Shared architecture and infrastructure
- One entry point
- Common look and feel
- Security (for EIA)
- Support for multiple surveys
- Section 508 compliance
- Life-cycle records management
Research: Visits to Statistical Agencies and Main Finding
- 11 broad questions
- Visits: BLS, Census, NASS, Westat/Blaise, EIA
- Main finding, three approaches:
  1) fully centralized, limited flexibility
  2) centralized core structure with survey-level flexibility and/or control
  3) fully decentralized
Visits to Statistical Agencies and Literature Search: Other Findings
- Little evidence of cost savings
- Evidence of higher-quality data
- Implementations are few and take-up is low
- Help support is often underestimated
- Organizational culture and buy-in are critical
- The best practice highlighted in the literature search was usability testing
Recommendations: Overview
Recommendation: follow four basic guiding rules supported by detailed requirements. The IDC:
1) is designed so that it is easy to implement, modify, or migrate a survey to the IDC
2) provides tools enabling the survey group to have ownership of their survey application
3) provides value added from the respondent's perspective (answers "What is in it for me?")
4) promotes high-quality data through editing features, user notifications, clear navigation, etc.
Detailed Key Functional Requirements
- A framework for developers to construct a design document for EIA's next generation of Internet data collection
- The focus is on what is required, not how to satisfy those requirements
- Detailed requirements fall into ten categories
- Categories most related to editing: graphical user interface (single sign-on, common look and feel, and functionality), front-end editing, user notification, and performance measures
Data Collection (GUI: Common Look and Feel & Functionality)
- Users have a common authentication and authorization approach; they enroll once, and further survey access is automated
- Elements have a common look and feel: respondents will not have to relearn the interface and will always know they are on an EIA site
- The survey manager chooses which elements to use, but any chosen item uses the common look
Data Collection (GUI: Common Look and Feel & Functionality, continued)
- The SUBMIT button invokes a final verification of unedited or unresolved fields (a sketch follows this list)
- Edit failures are displayed with links to the failed items
- Respondents may import data into the IDC; editing is still performed on imported data
- If the data pass all edits, or all hard edit failures have been resolved, the data are sent to EIA
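The slides specify the behavior, not an implementation. As a minimal TypeScript sketch, the submit-time gate could look like the following; the names (EditFailure, canTransmit, onSubmit) and the failure record are hypothetical.

```ts
// Minimal sketch of the submit-time gate (all names hypothetical).
// SUBMIT re-verifies every field; unresolved hard failures block
// transmission, while resolved or soft failures do not.

type Severity = "hard" | "soft";

interface EditFailure {
  fieldId: string;    // used to render a link back to the failed item
  severity: Severity;
  message: string;
  resolved: boolean;  // corrected, or commented/bypassed as the rules allow
}

// Data are sent to EIA only when no unresolved hard failures remain.
function canTransmit(failures: EditFailure[]): boolean {
  return failures.every(f => f.severity === "soft" || f.resolved);
}

function onSubmit(runAllEdits: () => EditFailure[]): void {
  // Imported data pass through the same edits as keyed data.
  const failures = runAllEdits();
  if (canTransmit(failures)) {
    console.log("Transmitting submission to EIA...");
  } else {
    failures
      .filter(f => !f.resolved)
      .forEach(f => console.log(`Edit failure on ${f.fieldId}: ${f.message}`));
  }
}
```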
Data Collection ("Front End" Editing)
- Prevent errors through drop-downs
- Provide the capability for "hard" edits on defined fields: must correct (e.g., numeric, integer, field length, required field)
- Provide the capability for other survey-defined "hard" edits: must correct or comment
- Provide the capability for "soft" edits: can correct, comment, or bypass (the three classes are sketched below)
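A sketch of how these three edit classes might be modeled, assuming a simple rule interface; the names and the two format-level checks are illustrative, not ISMS code.

```ts
// Hypothetical model of the three edit classes above: "hard" edits
// must be corrected; "hardWithComment" edits must be corrected or
// explained; "soft" edits may be corrected, commented, or bypassed.

type EditClass = "hard" | "hardWithComment" | "soft";

interface EditRule {
  id: string;
  kind: EditClass;
  check: (value: string) => boolean; // true means the value passes
  message: string;
}

// Format-level hard edits of the kinds the slide lists.
const requiredField: EditRule = {
  id: "required",
  kind: "hard",
  check: v => v.trim().length > 0,
  message: "This field is required.",
};

const integerOnly: EditRule = {
  id: "integer",
  kind: "hard",
  check: v => /^-?\d+$/.test(v),
  message: "Enter a whole number.",
};

// A failure is resolved only in a way its edit class permits.
function isResolved(rule: EditRule, corrected: boolean, comment?: string): boolean {
  if (corrected) return true;
  if (rule.kind === "hard") return false;                // must correct
  if (rule.kind === "hardWithComment") return !!comment; // correct or comment
  return true;                                           // soft: bypass allowed
}
```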
Data Collection ("Front End" Editing, continued)
- Provide the capability for various types of edits: within cell, cell-to-cell within a form, cell-to-cell historic, and cell-to-cell across other forms/surveys
- Provide the ability to implement edit rules (consistency, range, comparisons, etc.) that may involve calculations and/or parameters that survey managers can easily modify (see the sketch below)
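The following sketch illustrates parameter-driven rules of these kinds; the rule shapes and parameter names are assumptions, chosen to show how a survey manager could change behavior by editing parameters rather than code.

```ts
// Sketch of parameter-driven edit rules (hypothetical shapes).

type CellValues = Record<string, number>; // one form's cells, by cell id

// Within-cell range edit.
const withinRange = (min: number, max: number) =>
  (form: CellValues, cell: string): boolean =>
    form[cell] >= min && form[cell] <= max;

// Cell-to-cell within form: component cells must sum to the total cell.
const partsEqualTotal = (parts: string[], total: string) =>
  (form: CellValues): boolean =>
    Math.abs(parts.reduce((s, c) => s + form[c], 0) - form[total]) < 1e-6;

// Cell-to-cell historic: flag a change beyond `tolerancePct` percent.
const historicChange = (tolerancePct: number) =>
  (current: CellValues, prior: CellValues, cell: string): boolean =>
    Math.abs(current[cell] - prior[cell]) <=
      (tolerancePct / 100) * Math.abs(prior[cell]);

// A survey manager adjusts the parameters, not the functions:
const stocksInRange = withinRange(0, 100000);
const monthOverMonth = historicChange(25); // widen to 30% by editing one number
```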
Data Collection ("Front End" Editing, continued)
- Survey managers create edit failure messages using best-practices guidance
- All navigation objects first save all fields on the current screen, edit the appropriate fields, and display messages regarding user action before displaying the next screen (as sketched below)
- Edit failures are displayed with links to the failed items
- Submissions whose data pass all edits, or whose hard edit failures have been resolved, are sent to EIA
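One way to sequence the required navigation behavior, as a sketch; Screen, navigate, and the callback signatures are hypothetical, and a real instrument might let soft failures proceed.

```ts
// Sketch of the navigation sequence: save first, then edit, then
// message, and only then move on (all names hypothetical).

interface EditMessage { fieldId: string; message: string }

interface Screen {
  id: string;
  save(): void;          // persist all fields as currently entered
  edit(): EditMessage[]; // run the edits appropriate to this screen
}

function navigate(
  current: Screen,
  show: (m: EditMessage) => void,  // renders the message, linked to the item
  goTo: (screenId: string) => void,
  nextId: string,
): void {
  current.save();                  // 1) save all fields on the current screen
  const failures = current.edit(); // 2) edit the appropriate fields
  if (failures.length > 0) {
    failures.forEach(show);        // 3) display messages regarding user action
    return;                        //    before the next screen is displayed
  }
  goTo(nextId);                    // 4) display the next screen
}
```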
Data Collection (User Notification)
- Provide dual notification that a submission was received by EIA: on-screen at submission and a follow-up email
- Provide non-response notification: no submission when expected, or an insufficient submission
- Provide email to targeted sub-groups
- Provide notification of unsaved/unsubmitted data if the browser is closed before that action (see the sketch below)
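The browser-close case maps onto the standard beforeunload event; the sketch below assumes a hasUnsavedData flag that the form maintains as the respondent types.

```ts
// Sketch of the "data not saved" warning using the standard browser
// beforeunload event. `hasUnsavedData` is a hypothetical flag.

let hasUnsavedData = false;

// Any edit on the page marks the session as having unsaved data.
document.addEventListener("input", () => { hasUnsavedData = true; });

window.addEventListener("beforeunload", (event) => {
  if (hasUnsavedData) {
    event.preventDefault(); // modern browsers show a generic prompt
    event.returnValue = ""; // legacy browsers need returnValue set
  }
});

// After a successful save or submission the form would clear the flag:
// hasUnsavedData = false;
```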
Performance Measures
Measures about the survey:
1) Response measures
2) Edit failure and correction/bypass measures (see the sketch below)
3) Corrections/resubmissions measures
Measures about the collection mode:
1) Access counts (respondents and internal)
2) Change in respondent burden, respondent evaluation/usability, support costs, and number of registered users
Measures of the IDC mode's contribution to overall survey product performance:
1) Comparison against other modes of cost per respondent, overall costs, response rates, and "back-end" edit failures/call-backs/corrections, and the contribution to meeting or exceeding dissemination deadlines
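As an illustration of the edit-related measures, the sketch below computes failures per submission and the bypass rate per rule from a hypothetical event log; the record shape is an assumption.

```ts
// Illustrative computation of two edit-performance measures: failures
// per submission and the bypass rate per rule (a high bypass rate may
// flag a poorly tuned edit).

interface EditEvent { ruleId: string; outcome: "corrected" | "bypassed" }

function editMeasures(events: EditEvent[], submissions: number) {
  const byRule = new Map<string, { failures: number; bypassed: number }>();
  for (const e of events) {
    const m = byRule.get(e.ruleId) ?? { failures: 0, bypassed: 0 };
    m.failures += 1;
    if (e.outcome === "bypassed") m.bypassed += 1;
    byRule.set(e.ruleId, m);
  }
  return [...byRule].map(([ruleId, m]) => ({
    ruleId,
    failuresPerSubmission: m.failures / submissions,
    bypassRate: m.bypassed / m.failures,
  }));
}
```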
The Authoring Tool...
- Provides the common look and feel
- Provides the ability to easily implement, modify, or migrate a survey to the Internet collection
- Provides the ability for the survey group to have ownership of their survey application
- Provides a standard set of parameter-driven editing features to promote high-quality data (an illustrative definition follows)
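To make "parameter-driven" concrete, here is one possible shape for an authored edit definition: survey managers fill in metadata and parameters, and the rendering engine supplies the behavior. This schema is an assumption for illustration, not the actual ISMS format.

```ts
// Hypothetical authored edit definition; not the ISMS schema.

interface AuthoredEdit {
  element: string;                         // survey cell the edit applies to
  type: "range" | "consistency" | "historic";
  severity: "hard" | "soft";
  params: Record<string, number | string>; // modifiable without code changes
  failureMessage: string;                  // authored per best-practices guidance
}

const authoredSurveyEdits: AuthoredEdit[] = [
  {
    element: "totalStocks",
    type: "range",
    severity: "hard",
    params: { min: 0, max: 500000 },
    failureMessage: "Total stocks must be between 0 and 500,000.",
  },
  {
    element: "totalStocks",
    type: "historic",
    severity: "soft",
    params: { tolerancePct: 30 },
    failureMessage: "Value differs from last month by more than 30%.",
  },
];
```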
Internet Survey Management System (ISMS): Authoring Tool
ISMS Authoring Tool: Steps to Define Edits (Element-Specific)
ISMS Authoring Tool: Steps to Define Edits (Inter-Element)
Types of Edit Rules for ISMS
ISMS Rendered Survey Form
ISMS Rendered Survey Form: Edit Warning Message
Expected Benefits
Key benefits expected from all three viewpoints:
- Customers: ease of access and use; 24-7 availability; data import; one look and feel; an incentive to get something back
- Survey programs: reduced collection and processing costs; control and flexibility over survey elements and structure without responsibility for the infrastructure; higher-quality data through the editing function
- Corporate: a centralized platform with shared architecture and infrastructure to reduce development and maintenance costs; consistency in process across the organization
Conclusion
- EIA is developing the ISMS, inclusive of a custom authoring tool, to provide both a common look and feel and survey flexibility
- The authoring tool provides ease in implementing common edits
- Usability testing of rendered forms with internal users and respondents is to be performed soon
- Implementation of the first survey is expected next month
- Development has been critical-path based
- Other features specified in the requirements will be implemented in later versions