
Innovations in Collecting and Reporting Complex Survey Data
Association for Institutional Research Annual Forum, May 2013
Angie L. Miller, Ph.D. & Amber D. Lambert, Ph.D.
Center for Postsecondary Research, Indiana University

Background & Purpose
Colleges and universities are increasingly required to show measures of their effectiveness (Kuh & Ewell, 2010), and surveys are a common way to accomplish this. Online data collection allows researchers to incorporate several programming-based components into surveys (Dillman, 2007).

Background & Purpose
Complex survey features include:
- Skip logic: respondents receive follow-up questions based on their answers to filter questions (see the sketch after this list)
- Populated response options: the response options available are based on answers to earlier questions
- Filled-in question stems
- Java-enabled elements that prevent inconsistent responses
These features are intended to ease the process of taking the survey from the respondent's perspective, but they can complicate data management and reporting for the researcher.
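To make the skip-logic feature concrete, here is a minimal sketch using a hypothetical filter/follow-up pair (the question IDs are illustrative, not SNAAP's actual instrument):

```python
# Minimal skip-logic sketch: the follow-up question is delivered only
# when the filter question was answered "Yes". IDs are hypothetical.
def questions_to_show(responses):
    """Return the question IDs a respondent should receive."""
    shown = ["worked_as_artist"]  # filter question, always shown
    if responses.get("worked_as_artist") == "Yes":
        shown.append("years_as_artist")  # follow-up gated by the filter
    return shown

print(questions_to_show({"worked_as_artist": "No"}))   # ['worked_as_artist']
print(questions_to_show({"worked_as_artist": "Yes"}))  # ['worked_as_artist', 'years_as_artist']
```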

Examples from the Strategic National Arts Alumni Project (SNAAP)

SNAAP
As an example, we will discuss some of the ways that the Strategic National Arts Alumni Project (SNAAP) handles and reports complex data.
What is SNAAP?
- An online annual survey designed to assess and improve various aspects of arts-school education
- Investigates the educational experiences and career paths of arts graduates nationally
- Findings are provided to educators, policymakers, and philanthropic organizations to improve arts training, inform cultural policy, and support artists

Who does SNAAP survey?
Participants are drawn from:
- Arts high schools
- Independent arts colleges
- Arts schools, departments, or programs in comprehensive colleges/universities
Over 5 years, SNAAP has been administered at nearly 300 institutions of various focuses, sizes, and other institutional characteristics.
Cohort year sampling:
- 2008 and 2009 field tests: 5, 10, 15, & 20 years out
- 2010 field test: 1-5, 10, 15, & 20 years out
- 2011 and forward: all years, to generate the most comprehensive data possible

Increasing Numbers…
- 2010 Field Test: over 13,000 respondents, 154 institutions
- 2011 Administration: more than 36,000 respondents, 66 institutions
- 2012 Administration: more than 33,000 respondents, 70 institutions
We are now able to combine 2011 and 2012 respondents to create a "SNAAP Database" with over 68,000 respondents.

Questionnaire Topics
- Formal education and degrees
- Institutional experience and satisfaction
- Postgraduate resources for artists
- Career
- Arts engagement
- Income and debt
- Demographics

Data Management Issues: Skip Logic
Treating skip logic as a valid data point: if a respondent does not receive a question because of their answer to a previous question, leaving them as "missing" on the follow-up question can have erroneous implications for the data.

Data Management Issues: Skip Logic
SNAAP example: if a respondent answers "No" to the filter item, they do NOT receive the follow-up item (survey screenshots shown on slide).

Data Management Issues: Skip Logic
SNAAP example: asking about length of time spent working as an artist would not make sense for someone who has not worked as an artist. But leaving the data point as "missing" does not differentiate the non-artists from:
- Those who saw the question and chose not to answer it (non-response), and
- Those who did not complete the survey (break-off)

Data Management Issues: Skip Logic
SNAAP example: to address this issue, we developed a system for coding those who did not receive questions due to skip logic with valid response values:
- Assigned them negative values (-1, -2, etc.)
- Used multiple negative values when there were multiple reasons why respondents might not receive the question
- Left non-response and break-off cases as missing data points
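A minimal pandas sketch of this negative-value recoding, assuming hypothetical column names and -1 as the code for "did not receive due to the filter answer":

```python
import pandas as pd

# Hypothetical data: before recoding, NaN on the follow-up is ambiguous.
df = pd.DataFrame({
    "worked_as_artist": ["Yes", "No", "Yes"],
    "years_as_artist":  [3.0, None, None],
})

# Respondents who answered "No" never saw the follow-up question;
# give them a valid negative code instead of leaving them missing.
SKIPPED_BY_FILTER = -1
mask = (df["worked_as_artist"] == "No") & df["years_as_artist"].isna()
df.loc[mask, "years_as_artist"] = SKIPPED_BY_FILTER

print(df)
# The "No" respondent is now coded -1; the remaining NaN is genuine
# non-response or break-off, not skip logic.
```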

Data Management Issues: Skip Logic
SNAAP example: what about those who dropped out of the survey, but would not have received the follow-up questions based on their earlier answers? We incorporated into the coding system a rule assigning missing values to all questions past each respondent's break-off point, regardless of answers to filter questions. We do not want fluctuations in the number of valid responses as one progresses through the survey instrument (the number should only decrease as people drop out).
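A sketch of this break-off rule, under the assumption that paradata records each respondent's last answered question (the column and question IDs are hypothetical):

```python
import numpy as np
import pandas as pd

QUESTION_ORDER = ["q1_filter", "q2_followup", "q3_income"]  # instrument order

def apply_breakoff_rule(df):
    """Force every question past a respondent's break-off point back to
    missing, overriding any skip-logic codes already assigned there."""
    out = df.copy()
    breakoff_pos = out["last_answered"].map(QUESTION_ORDER.index)
    for i, question in enumerate(QUESTION_ORDER):
        out.loc[breakoff_pos < i, question] = np.nan
    return out

df = pd.DataFrame({
    "q1_filter":     ["No", "Yes"],
    "q2_followup":   [-1.0, 2.0],   # -1 = skipped by filter
    "q3_income":     [-1.0, np.nan],
    "last_answered": ["q1_filter", "q2_followup"],
})
print(apply_breakoff_rule(df))
# Respondent 0 broke off after q1, so q2 and q3 revert to NaN even
# though skip logic would otherwise have coded them -1.
```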

Data Management Issues: Inconsistent Responses
A respondent might answer a filter question one way, receive (and answer) the follow-up question, but then back up in the survey and change the filter answer, so that they should NOT have received the follow-up.
SNAAP example: a respondent says they are currently a professional artist and answers "less than 1 year" for length, then goes back and changes their response to say that they have never been a professional artist. Which response is the right one?

Data Management Issues: Inconsistent Responses
SNAAP example: we imposed a rule that keeps the most "recent" response as accurate. This necessitates altering original responses on follow-up questions that are no longer relevant: it would make the "less than one year" response into a negative value (did not receive due to skip logic).
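A minimal sketch of this reconciliation rule (the value labels are illustrative):

```python
SKIPPED_BY_FILTER = -1  # did not receive due to skip logic

def reconcile(filter_answer, followup_answer):
    """Treat the most recent filter answer as authoritative; if it means
    the follow-up should never have been shown, overwrite the stale
    follow-up answer with the skip-logic code."""
    if filter_answer == "Never been a professional artist":
        return SKIPPED_BY_FILTER
    return followup_answer

# The respondent first answered "less than 1 year", then backed up and
# changed the filter to "never": the follow-up response is recoded.
print(reconcile("Never been a professional artist", "less than 1 year"))  # -1
```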

Reporting Issues: Skip Logic
Because valid response values are assigned to follow-up questions on which some respondents were skipped, when reporting frequencies for each question it is imperative to visually notate (Sanders & Filkins, 2009) which values are assigned due to skip logic and which are responses on the survey itself. Keeping skip-logic values allows one to make statements about the entire sample, rather than requiring an "of those who…" statement throughout the report.
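A sketch of a full-sample frequency table in which the skip-logic code carries its own label (the values and labels are hypothetical):

```python
import pandas as pd

# Recoded follow-up: -1 is a valid value, not missing data.
years = pd.Series([-1, -1, 1, 2, 2, 3])
labels = {
    -1: "Did not receive: never worked as an artist (skip logic)",
    1: "Less than 1 year",
    2: "1 to 5 years",
    3: "More than 5 years",
}

# Percentages are over the ENTIRE sample, so no "of those who..."
# qualifier is needed when reporting.
freq = years.map(labels).value_counts(normalize=True).mul(100).round(1)
print(freq)
```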

Reporting Issues: Skip Logic SNAAP Example: Italicized skip logic labels in reports

Reporting Issues: Skip Logic
There is also the option of presenting frequencies only for those who received the question, leaving those skipped as missing. This is problematic if consumers of the report do not thoroughly read the accompanying statements (Suskie, 1996); they may then easily misrepresent the survey results. Reporting those skipped on the question prevents distortion of the data.

Reporting Issues: Skip Logic
SNAAP example: we only provide a PDF version of this report, to deter copying/pasting without the accompanying introduction language.

Reporting Issues: Skip Logic
Questions that populate from previous "check all" lists should also be treated like other questions using skip logic.
SNAAP example: the survey asks respondents to check all that apply from a list of jobs associated with the arts in which they have EVER worked, and does the same for a list of jobs outside the arts. Only the jobs they selected then appear in a later question, this time asking in which of these they CURRENTLY work.
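A sketch of this populating logic, using an illustrative (not SNAAP's actual) job list:

```python
# "Check all" populating sketch: the CURRENT-jobs question offers only
# the jobs the respondent said they EVER held. The job list is illustrative.
ARTS_JOBS = ["Architect", "Art director", "Craft artist", "Dancer"]

def current_job_options(ever_worked):
    """Restrict a later question's response options to the respondent's
    earlier selections, preserving the original list order."""
    return [job for job in ARTS_JOBS if job in ever_worked]

print(current_job_options({"Dancer", "Craft artist"}))
# ['Craft artist', 'Dancer']
```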

Reporting Issues: Skip Logic SNAAP Example: Jobs associated with the arts EVER

Reporting Issues: Skip Logic SNAAP Example: CURRENT jobs

Reporting Issues: Skip Logic SNAAP Example: What appeared as a “check all” in the survey is presented like other “radio button” items

Importance of Codebook
Integrating skip logic into reporting highlights the need to maintain a detailed codebook for reference. Dynamic surveys should include several components in their codebook:
- Who receives each question
- Which response options are populated based on previous answers
- Which places have individual words/phrases filled into a question stem
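One possible shape for such a codebook entry, sketched as a Python dictionary (the field names and content are illustrative, not SNAAP's actual codebook format):

```python
# Illustrative codebook entry covering the components above: who
# receives the question, skip codes, populated options, and stem fills.
CODEBOOK = {
    "years_as_artist": {
        "stem": "How long have you worked as a professional artist?",
        "received_by": "respondents answering 'Yes' to worked_as_artist",
        "skip_codes": {-1: "never worked as a professional artist"},
        "options_populated_from": None,  # options not drawn from a prior item
        "stem_fill": None,               # no words filled into the stem
    },
}

print(CODEBOOK["years_as_artist"]["received_by"])
```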

Importance of Codebook
SNAAP examples: codebook excerpts (screenshots shown on slides).

References
Dillman, D. A. (2007). Mail and internet surveys: The tailored design method (2nd ed.). New York: Wiley.
Kuh, G. D., & Ewell, P. T. (2010). The state of learning outcomes assessment in the United States. Higher Education Management and Policy, 22(1).
Sanders, L., & Filkins, J. (2009). Effective reporting (2nd ed.). Tallahassee, FL: Association for Institutional Research.
Suskie, L. A. (1996). Questionnaire survey research: What works (2nd ed.). Tallahassee, FL: Association for Institutional Research.

Questions or Comments?
Contact Information:
Angie L. Miller
Amber D. Lambert
Strategic National Arts Alumni Project (SNAAP)
(812)