Survey Participation: A Study of Student Experiences and Response Tendencies


Survey Participation: A Study of Student Experiences and Response Tendencies
Allison M. Ohme, IR Analyst
Heather Kelly Isaacs, Assistant Director
Dale W. Trusheim, Associate Director
Office of Institutional Research & Planning, University of Delaware
June 1, 2005 ~ AIR 2005, San Diego, CA

Background
University of Delaware Fall 2004 Enrollment:
Undergraduate: 16,548
Graduate: 3,395
Professional & Continuing Studies: 1,295
TOTAL: 21,238
Carnegie Classification: Doctoral/Research – Extensive

Background (cont.)
High-ability student body, with an academic profile that has been rising over the past five years.

Our Past Surveys
IR typically surveys undergraduates each spring, alternating among the ACT Student Opinion Survey, NSSE, and a homegrown survey.
Examples of response rates:
Student Opinion – 1995: 30%; 1998: 26%; 2002: 21%
Career Plans – 1996: 46%; 1999: 43%; 2001: 37%

A Survey about Surveys?
Faced with declining response rates, we developed a systematic study to examine the issues behind poor response:
Incentives?
Timing of administration?
Paper vs. web survey?

Research Objectives
Use focus groups and telephone interviews to discover:
How many survey requests does a typical undergraduate receive?
What factors make students likely to respond (or not respond) to a survey?
Then use this information to improve student response rates on future surveys.

Methodology – Survey Questions (see Appendix A)
Thinking back to the previous full academic year, how many surveys from any source were you asked to complete at the University?
What was the source of the survey(s)?
How many surveys did you complete and return?
What reasons helped you decide to complete and return the survey(s)?

Methodology – Survey Questions (cont.)
What reasons made you decide not to complete and return a survey?
How do you feel when you receive an unsolicited survey? What kind of impact does it have on you?
What suggestions do you have for increasing student response rates at UD?

Methodology – Initial Research Design
Random sample of full-time undergraduate students continuing from the previous academic year.
Contact students via telephone and ask the screening question: Have you received at least one unsolicited survey from the University in the past academic year?
If "yes", the student was invited to participate in one of five focus groups (ten students per group).

Methodology – Initial Research Design (cont.)
If unable to attend a focus group, the student was given the opportunity to answer the same research questions as part of our telephone survey group. Once 50 students had answered the telephone survey, this portion of the methodology was closed.
Incentive: two drawings for $100 gift certificates redeemable in downtown Newark.

Methodology – Adjusting the Research Design
After only slight success in filling the focus groups, we:
Opened the study to students answering "no" to the screening question.
Drew an additional sample of students who had been sent an Economic Impact Survey in Fall 2003.

Methodology – Need for an Additional Method
Low focus group attendance (even after confirming with participants) yielded only 8 students across three groups.
Added a third method: in-person interviews of students in the UD Student Center's Food Court. Students answered the same questions and were given a $5 coupon redeemable in campus Food Courts.

Total Sample
Focus Group Sample (n=8)
Telephone Interview Sample (n=50)
In-Person Interview Sample (n=50)
Total Sample across 3 methods (n=108)
See the complete demographic breakdown in Appendix B.

Findings
In the previous academic year:
26% of respondents did not receive any unsolicited surveys.
46% received 2 or more surveys.
Survey sources: academic departments, Honors Program, Dining Services, graduate students, etc.

Findings – (cont.)
How many surveys did students complete and return? Of the 80 students who received surveys:
66% completed and returned all of the surveys.
24% completed and returned some of the surveys.
10% did not complete or return any of the surveys.
~ Remember, these are the self-reported response rates of students who volunteered to participate in this study; it is no surprise that they are higher than typical survey response rates.
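As a quick consistency check on the figures above: 80 of the 108 respondents received at least one survey, so the 28 who did not make up the reported 26%. The raw counts of 53, 19, and 8 students are an assumption here, inferred only because they are the whole numbers out of 80 that round to the reported percentages.

```python
# Hypothetical raw counts (an assumption, not reported on the slide)
# chosen so that 53 + 19 + 8 = 80 students who received a survey.
counts = {"all": 53, "some": 19, "none": 8}
total = sum(counts.values())

# Percentage of each group, rounded to whole percents as on the slide.
percentages = {group: round(100 * n / total) for group, n in counts.items()}
print(percentages)  # {'all': 66, 'some': 24, 'none': 10}

# Respondents who received no surveys at all: 108 total minus 80 recipients.
print(round(100 * (108 - total) / 108))  # 26
```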

Findings – (cont.)
Reasons for completing and returning surveys:
Desire to help UD.
The survey related to the student's interests, or its results could affect their personal experience.
Students completed both e-mail and paper surveys when they had "free time" and the survey required minimal effort.
When approached in person, students found it difficult to refuse, especially when receiving an instant incentive.

Findings – (cont.) T-ShirtSchoolbooks & suppliesFree MealCandy Desirable incentives: Coupon to receive any of the above Money Any incentive students can accept immediately

Findings – (cont.)
Reasons for not completing and returning surveys:
The survey was not of interest to the student.
Annoyance at receiving many and/or repeated survey requests.
The survey seemed too complicated or required too much time/effort to complete.
Impact on students?
Most students understand that surveys are a normal procedure for any university or organization.
However, students are frustrated when they see no changes or receive no follow-up after completing past surveys.

Findings – (cont.)
Suggestions for increasing response rates:
Use the incentives mentioned above.
Tailor survey descriptions with explicit impact statements.
Offer follow-up to announce results and impact.
Keep surveys short, requiring little effort to understand and complete.
Best time to survey = mid-semester.
~ Survey method preference (e-mail, paper, in-person) varies by student.

Challenges in Practice
Survey administration is decentralized across campus.
Using multiple methods (paper/web-based) for one study requires additional coordination.
Students already feel "over-surveyed".
A high volume of spam in students' UD inboxes.

Improving Response Rates
Entering Student Needs Assessment: 2001 = 21%; 2003 = 15%
2004 ACT Survey = 69%
A 69% response rate – how did we do it?

Another Example… Career Plans Survey
2002 = 48% (random sample of 25% of baccalaureate recipients)
2003 = 41% (random sample of 50% of baccalaureate recipients)
2004 = 50% (entire class of baccalaureate recipients sampled)

Questions or Comments?

Thank you!
Allison M. Ohme ~ Heather Kelly Isaacs ~ Dale W. Trusheim