Conducting Surveys at Cornell University
Marin Clarkberg, Associate Director, Institutional Research and Planning
Yasamin Miller, Director, Survey Research Institute

Who We Are
Institutional Research and Planning (IRP)
- Within the Division of Budget and Planning; its goal is to inform institutional decision-making
- Administers regular, large-scale surveys to students and other University constituencies
Survey Research Institute (SRI)
- Full-service survey enterprise at Cornell
- Has designed, hosted, and analyzed hundreds of surveys for non-profit, government, corporate, and Cornell clients

Rationales for surveys
Why do a survey?
- Increasing call to have “real data” and to assess processes and outcomes
- Looks easy (and inexpensive… and maybe even fun)
Why not do a survey?
1. Lots of data is already available
2. Survey fatigue
3. Not as easy as it might seem at first
4. Serious limitations on what survey data can actually tell you about processes and outcomes

Why you shouldn’t: 1. Data already exists
- Everyone has some data… Has existing data been thoroughly analyzed and understood?
- In addition, the University archives an enormous amount of data about students (and other constituencies)
  - Students’ academic records
  - IRP surveys, other surveys

Why you shouldn’t: 2. Survey fatigue
- Surveys are more common
  - IRP has been surveying regularly since 2000
  - CIT-hosted surveys (e.g., WebSurveyor): over 1,600 surveys
- Survey response rates are down
  - Senior Survey response rates: 61% in 1998; 50% in 2002; 45% in 2006
- Student (staff, faculty) time is a university resource
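The arithmetic behind those response-rate figures is simple; below is a minimal Python sketch that recomputes them. Only the percentages (61%, 50%, 45%) come from the slide; the invitation and completion counts are hypothetical placeholders.

```python
# Minimal sketch of the response-rate arithmetic behind the slide's figures.
# The counts below are hypothetical; only the rates (61%, 50%, 45%) are from the slide.
senior_survey = {
    1998: {"invited": 3000, "completed": 1830},  # hypothetical counts -> 61%
    2002: {"invited": 3000, "completed": 1500},  # -> 50%
    2006: {"invited": 3000, "completed": 1350},  # -> 45%
}

for year, counts in sorted(senior_survey.items()):
    rate = counts["completed"] / counts["invited"]
    print(f"{year}: {rate:.0%} response rate")
```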

Why you shouldn’t: 3. There is such a thing as survey expertise
- Question design: bad questions give you bad data
- Instrument design: respondents often bail out of unsatisfactory surveys
- Sample design: samples are often adequate to the task, and sampling saves all kinds of resources
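As one illustration of why sample design matters and how sampling saves resources, here is a minimal sketch (not from the slides) of the 95% margin of error for a proportion estimated from a simple random sample, with a finite population correction. The population size is a hypothetical placeholder.

```python
# Minimal sketch: 95% margin of error for a sample proportion, with a
# finite population correction. Population size N is hypothetical.
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """95% margin of error for a proportion from a simple random sample of size n."""
    se = math.sqrt(p * (1 - p) / n)        # standard error, worst case at p = 0.5
    fpc = math.sqrt((N - n) / (N - 1))     # finite population correction
    return z * se * fpc

N = 14000  # hypothetical student population
for n in (200, 400, 800, 1500):
    print(f"sample of {n:>4}: ±{margin_of_error(n, N):.1%}")
```

A few hundred responses already yield a margin of error of a few percentage points, which is often sufficient for institutional decisions and spares most students from being surveyed at all.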

Why you shouldn’t: 4. Survey research has major limitations
- The ability to generalize from survey data is a function of response patterns: respondents may differ in important ways from nonrespondents
- Surveys cannot demonstrate causation
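One common way to gauge that limitation is to compare respondents with the full sampling frame on something known for everyone. The sketch below is an assumed workflow (not from the slides), using hypothetical records with college as the frame variable.

```python
# Minimal sketch: compare respondents to the full sampling frame on a
# variable known for everyone (college), to flag nonresponse skew.
from collections import Counter

# Hypothetical records: (id, college, responded?)
frame = [
    ("s1", "Arts & Sciences", True),
    ("s2", "Engineering",     False),
    ("s3", "Engineering",     True),
    ("s4", "Agriculture",     False),
    # ... in practice, the full frame from the Registrar
]

def share_by_college(records):
    counts = Counter(college for _, college, _ in records)
    total = sum(counts.values())
    return {college: n / total for college, n in counts.items()}

frame_shares = share_by_college(frame)
respondent_shares = share_by_college([r for r in frame if r[2]])

for college in frame_shares:
    print(f"{college}: frame {frame_shares[college]:.0%}, "
          f"respondents {respondent_shares.get(college, 0):.0%}")
```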

What are the alternatives to surveys?
- Ransack existing sources of data
  - Multiple sources of data help “triangulate”
  - If nothing else, you can learn more about what you don’t know (and thus what remains to be learned)
- Look at alternate modes of collecting data
  - Some questions are better addressed with observation, focus groups, interviews, etc.

So when is a survey appropriate?
- Existing data is well understood and unanswered questions are clearly identified
- The scope of the survey will minimize the imposition on respondents: as short as possible, asked of as few as possible
- Appropriate survey expertise is involved in designing the study: allow ample time for coordination, consultation, design, and pre-testing
- There is a shared understanding of the study’s limitations: start small and manage expectations

How to Survey
1. Develop a reasonable timeline
2. Define your research questions
3. Design the survey instrument and sampling plan
4. Develop a data security plan
5. Notify and secure approvals
6. Analyze the data and report results

1. Develop a reasonable timeline
- Putting questions on the web, collecting responses, and even analyzing the data are the easy parts
- Even once the task is clearly defined, four to six months is not an unreasonable amount of time to develop and pre-test a survey instrument

4. Develop a data security plan
Anonymous data
- Identities of respondents are never captured
- Fewer data security concerns
- Impossible to know who responded (or how often)
- Impossible to link survey data with other data sources
Confidential data
- Identities of respondents are kept, but secured
- Requires very secure file storage
- Possible to link survey data to other data of interest
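For confidential (not anonymous) data, a standard precaution is to separate survey responses from identifiers as early as possible. The sketch below is an assumed illustration, not documented Cornell practice: it replaces net IDs with random study codes and writes the linking key to a separate file that would live under stricter access controls. The field names and file path are hypothetical.

```python
# Minimal sketch: pseudonymize confidential survey data by swapping net IDs
# for random study codes and storing the linking key in a separate file.
import csv
import secrets

def pseudonymize(responses, key_path="linking_key.csv"):
    """Replace net IDs with random study codes; write the crosswalk separately."""
    crosswalk = {}
    coded = []
    for row in responses:
        code = crosswalk.setdefault(row["netid"], secrets.token_hex(8))
        coded.append({"study_id": code,
                      **{k: v for k, v in row.items() if k != "netid"}})
    # The key file should sit under stricter access controls than the responses.
    with open(key_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["netid", "study_id"])
        writer.writerows(crosswalk.items())
    return coded

# Hypothetical usage:
coded = pseudonymize([{"netid": "abc123", "q1": "agree"}])
print(coded)  # [{'study_id': '...', 'q1': 'agree'}]
```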

5. Notify and Secure Approvals
- IRP Survey Calendar (notification is a courtesy)
- Institutional Review Board: if a project is “research” (i.e., it develops or contributes to generalizable knowledge), it needs to be reviewed and approved by the IRB
- Data Stewards
  - Students: the Office of the Registrar
  - Student and Academic Services: SAS-Research Group

Resources on campus: find the help you need
Institutional Research and Planning (IRP)
- Serves the University
- Stewards of much existing student survey data
- Available to consult on institutional studies (sampling plans, instrument design and review)
Survey Research Institute (SRI)
- Comprehensive survey services, from initial planning to data analysis and reporting
- IRP uses SRI for survey hosting and administration
Institutional Review Board for Human Participants
- First and last authority on mandated review requirements and processes – ask them.