The Survey as an Assessment Method
Why, when, and how surveys provide evidence to inform decision-making
Carrie Towns, Office of Institutional Research and Planning, and the Information Services Assessment Council
April 13, 2006
Assessment: an ongoing process in which services, resources, and performance are measured against the expectations of users, and improvements are made to satisfy user needs effectively and efficiently.
What do we need to know?
Who can tell us?
How can we get the information?
What will it enable us to do?
How much will it cost?
Definition of Survey
An assessment tool/system for collecting information that is used to describe, compare, and explain the knowledge, attitudes, and behavior of a defined group or groups of respondents.
Conduct a Survey When:
- Input is needed from a large, well-defined group.
- A set of focused questions has been developed to meet a specified objective.
- Results will inform specific decisions.
- Time and other resources permit.
Characteristics of a Good Survey
- Specific objectives
- Straightforward questions
- Sound research design
- Appropriate resources
Types of Surveys
- Information gathering (attitudes/opinions/behavior)
- Market research tools
- Public relations tools
- Educational tools
Survey Process
- Set survey context – specify objectives
- Establish target audience – sampling frame
- Determine mode
- Prepare cost estimate
- Establish tentative calendar
- Design survey
- Develop questions
- Plan for analysis
- Field test
- Administer survey
- Summarize and interpret data
- Report results
Survey Context – Specify Objectives
- Who is asking for the information?
- What do they want to learn?
- How will the information be used?
- Starting point – is there an existing instrument?
Establish Target Audience / Sampling Frame
- Who can provide the information?
- Type of analysis needed – how detailed?
- Number of subgroups of interest
- Plans for follow-ups
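A minimal sketch of drawing a simple random sample from a sampling frame, under stated assumptions: the frame is a hypothetical sampling_frame.csv with one potential respondent per row, and the file names, column layout, and sample size are illustrative only, not from the presentation.

```python
# Draw a simple random sample from a sampling frame (hypothetical file/columns).
import csv
import random

SAMPLE_SIZE = 400  # hypothetical; sized for the analysis detail and subgroups needed

# Load the sampling frame: one potential respondent per row
with open("sampling_frame.csv", newline="") as f:
    frame = list(csv.DictReader(f))

random.seed(2006)  # fixed seed so the draw can be reproduced for follow-ups
sample = random.sample(frame, min(SAMPLE_SIZE, len(frame)))

# Write the selected respondents out for survey administration
with open("sample.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=frame[0].keys())
    writer.writeheader()
    writer.writerows(sample)
```

If subgroup comparisons matter, a stratified draw (sampling within each subgroup separately) would replace the single random.sample call, so that small subgroups are not underrepresented.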
Modes of Surveys
- Web
- Paper and pencil
- Telephone
- Focus groups
Prepare Cost Estimate
- OIRP professional time: $25/hour
- Commercial tools and associated costs
- Other costs
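A worked example of the arithmetic, with hypothetical quantities (only the $25/hour OIRP rate comes from the slide above): 30 hours of OIRP professional time × $25/hour = $750; adding an assumed $300 for a commercial web-survey tool license and $150 for printing and postage gives an estimated total of $750 + $300 + $150 = $1,200.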
Establish Tentative Calendar
- Development
- Field testing
- Administration
- Programming and analysis
- Reporting
Design Survey
- Length and layout
- Organization
- Question formats
Characteristics of Good Questions
- Make sense to the respondent
- Are concrete
- Use conventional language
- Avoid emotionally charged language
- Avoid negative phrasing
- Ask for only one piece of information (for example, "Was the staff friendly and knowledgeable?" asks two questions at once and should be split)
- Have a specific purpose
Plan for Analysis
- Presentation of data drives item development
- Different audiences require different levels of analysis
Field Test / Revise and Fine-Tune Instrument
- Test-drive the instrument

Administer Survey
- Data collection phase
Summarize Data
- Examine the data
- Run preliminary analysis
- Dig in – interpret and draw conclusions

Report Results
- Written/oral
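A minimal sketch, in Python with pandas, of the summarize-data pass described above. The file name and column names (responses.csv, status, satisfaction) are hypothetical, chosen only to illustrate the examine / preliminary-analysis / dig-in sequence.

```python
# Preliminary summary pass over collected survey responses (hypothetical columns).
import pandas as pd

responses = pd.read_csv("responses.csv")

# Examine the data: how many responses, and how much is missing per item
print(responses.shape)
print(responses.isna().sum())

# Preliminary analysis: frequency distribution of a 1-5 satisfaction rating
print(responses["satisfaction"].value_counts(normalize=True).sort_index())

# Dig in: compare subgroups with a cross-tabulation of respondent status by rating
print(pd.crosstab(responses["status"], responses["satisfaction"], normalize="index"))
```

Frequencies and cross-tabs like these are usually enough for a first written or oral report; more detailed modeling can follow for audiences that need it.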
Information Services Assessment Council Members
- Susanne Clement, Libraries
- Jill Glaser, IT
- Ryan Papesh, NTS
- Thelma Simons, IT
- John Stratton, Libraries
- Bill Myers, IS
Call on ISAC members to:
- Consult, advise, and assist in the development of assessment initiatives.
- Identify other campus resources for assessment-related services.
- Provide oversight and assure coordination with other IS assessment activities.