
STUDENT AFFAIRS RESEARCH AND ASSESSMENT

TO SURVEY OR NOT TO SURVEY? THAT'S A REALLY GOOD QUESTION!

"It is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail." ~ Abraham Maslow, 1966

SARA Assessment Brown Bag
February 10, 2011

Why is assessment important?
Helps determine if you are meeting your educational objectives
Helps ensure that you have the resources you need
Helps prioritize efforts
Can contribute to our understanding of student learning and development

Key steps (PDCA)
Identify outcomes
Identify appropriate measures
Choose an appropriate assessment method
Choose an appropriate research design
Collect the data
Analyze the data
Disseminate the findings
Take action
Wash, rinse, repeat!

Presentation outline
Why do you ask?
Review types of outcomes and measures
Benefits and drawbacks to surveys
Other types of assessments
It's your turn - practice makes perfect!

Why not use a survey?
The ease of web-based surveys has led to their proliferation
The result is that surveys are the "go-to" research tool, but…
◦ Not every question is a nail
◦ Survey fatigue is a significant threat
◦ Multiple types of evidence (assessment triangulation) build a stronger case

Starting questions
1. What type of outcome do you want to measure?
◦ Cognitive (knowledge)
◦ Affective (attitudes)
2. What type of data do you want to collect?
◦ Psychological (personal traits)
◦ Behavioral (observable activities)
3. Time frame?
◦ Short-term
◦ Long-term

Taxonomy of Student Outcomes (Astin, 1993)
(type of data crossed with type of outcome)
Psychological / Cognitive: subject-matter knowledge, academic ability, critical thinking ability, basic learning skills, special aptitudes, academic achievement
Psychological / Affective: values, interests, self-concept, attitudes, beliefs, satisfaction with college
Behavioral / Cognitive: degree attainment, vocational achievement, awards or special recognition
Behavioral / Affective: leadership, citizenship, interpersonal relations, hobbies and avocations

Time: Examples of Short- and Long-Term Outcomes (Astin, 1993)
(short-term = during college; long-term = after college)
Cognitive / Behavioral: completion of AlcoholEdu (vs. noncompletion) → exhibits responsible drinking behaviors
Cognitive / Psychological: MCAT score → score on medical licensing exam
Affective / Behavioral: participation in student government → involvement in local or national politics
Affective / Psychological: satisfaction with college → job satisfaction

Types of measures
Direct measures
Indirect measures
Norm-referenced
Criterion-referenced
Self-referenced
Although direct measures are typically preferred, practically speaking, your overall assessment plan should contain a mix of these.
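The norm-/criterion-referenced distinction above can be made concrete with a toy sketch (all scores and the cutoff are invented for illustration, not from the slides): the same score reads differently depending on the frame of reference.

```python
# Hypothetical scores for ten students on some assessment.
scores = [55, 62, 70, 74, 78, 81, 85, 88, 92, 97]

def criterion_referenced(score, cutoff=75):
    # Criterion-referenced: did the student meet a fixed standard?
    return score >= cutoff

def norm_referenced(score, group):
    # Norm-referenced: how did the student do relative to peers?
    # (percentile rank within the group)
    below = sum(1 for s in group if s < score)
    return 100 * below / len(group)

print(criterion_referenced(78))     # meets the (hypothetical) 75-point criterion
print(norm_referenced(78, scores))  # percentile rank within this group
```

A 78 clears the fixed criterion comfortably but sits in the bottom half of this particular group, which is exactly why the frame of reference must be chosen before the measure.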

Your question should guide your choice of assessment tool, and not the other way around!

Surveys
"Surveys represent one of the most common types of quantitative, social science research. In survey research, the researcher selects a sample of respondents from a population and administers a standardized questionnaire to them."

Surveys can be a good tool if you are interested in:
Perceptions
Beliefs
Motivations
Future plans
Past behavior
Private behavior

What about student learning?
Research indicates that aggregate self-reports of learning can provide a reasonable estimation of actual learning.*
Self-reported data are most valid when:**
◦ the information is known to the respondents,
◦ the questions are unambiguous and refer to recent activities,
◦ the respondents take the questions seriously, and
◦ responding has no adverse consequences and does not encourage socially desirable, rather than truthful, answers.
* Anaya, 1999; Kuh, Kinzie, Schuh, Whitt, & Associates, 2005; Laing, Sawyer, & Noble, 1989; Pace, 1985; Pike, 1995.
** Kuh et al., 2005; Pike, 1995.

Validity and reliability
Surveys tend to be weak on validity and strong on reliability.
The artificiality of the survey format strains validity: people's real feelings are hard to capture in dichotomies such as "agree/disagree," "support/oppose," or "like/dislike," so responses are only approximate indicators of what we have in mind when we write the questions.
Reliability is a clearer matter. Survey research presents all subjects with a standardized stimulus, which goes a long way toward eliminating unreliability in the researcher's observations, and careful wording, format, and content can significantly reduce the subjects' own unreliability.
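Reliability of a survey scale is often quantified with Cronbach's alpha, a standard internal-consistency estimate. A minimal sketch, using entirely made-up responses from five respondents to a four-item scale scored 1-5:

```python
def cronbach_alpha(items):
    """items: one list of scores per survey item (same respondents, same order)."""
    k = len(items)                     # number of items in the scale
    n = len(items[0])                  # number of respondents

    def variance(xs):                  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(variance(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]  # each respondent's total
    return (k / (k - 1)) * (1 - sum_item_vars / variance(totals))

# Hypothetical data: rows are items, columns are respondents A-E.
responses = [
    [4, 5, 3, 4, 2],   # item 1
    [4, 4, 3, 5, 2],   # item 2
    [5, 5, 2, 4, 1],   # item 3
    [3, 4, 3, 4, 2],   # item 4
]
print(round(cronbach_alpha(responses), 2))
```

By convention, values above roughly 0.7 are treated as acceptable internal consistency, though that threshold is a rule of thumb, not a law.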

Representativeness
If your respondents are not representative of your population, your results may be misleading.
◦ Example: You want to survey all undergraduate students about their attitudes toward a student honor code. Your friend is in charge of the FYS program and offers to have your survey passed out in all first-year seminars. You're excited to get this direct push for your survey, but how might this affect your results?
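The honor-code example can be sketched numerically. All numbers below are hypothetical: suppose 70% of first-year students support the code but only 50% of upper-class students do; surveying only first-year seminars then overstates campus-wide support.

```python
# Each entry is (class year, 1 if supports the honor code else 0).
# Hypothetical campus: 1,000 first-years (70% support),
# 3,000 upper-class students (50% support).
population = (
    [("first-year", 1)] * 700 + [("first-year", 0)] * 300 +
    [("upper-class", 1)] * 1500 + [("upper-class", 0)] * 1500
)

def support_rate(sample):
    return sum(vote for _, vote in sample) / len(sample)

fys_only = [s for s in population if s[0] == "first-year"]

print(round(support_rate(population), 2))  # true campus-wide rate
print(round(support_rate(fys_only), 2))    # first-year-seminar convenience sample
```

The convenience sample reports 70% support against a true figure of 55%: no amount of additional first-year respondents fixes this, because the bias is in who was asked, not how many.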

Strengths of surveys
Relatively inexpensive
Can reach large numbers of people
Large numbers allow for multivariate analyses
Can ask many questions relatively quickly
Standardized instruments (like NSSE) allow for comparisons between groups
Can be confidential or anonymous
Can have high reliability

Strengths continued
Can provide the student, alumni, or employer perspective on the institution or program
Can make respondents feel that their opinions matter
Ease of response can elicit information from hard-to-reach individuals
Results are easily understood

Weaknesses of surveys
Questions have to be general enough to apply to all or most respondents
Inflexible: forced-choice responses may not allow respondents to express their true opinions
Require good response rates to achieve representative results
May be hard for respondents to recall information or answer truthfully
Can seldom deal with context
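On response rates and sample size: a rough margin-of-error calculation (95% confidence, simple random sample, proportion near 0.5 as the worst case; numbers illustrative, not from the slides) shows why small respondent counts give fuzzy estimates. Note this captures sampling error only; as the representativeness example shows, a low response rate can also introduce nonresponse bias that no margin of error reflects.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    # Standard normal-approximation margin of error for a proportion:
    # z * sqrt(p(1-p)/n), with z = 1.96 for 95% confidence.
    return z * math.sqrt(p * (1 - p) / n)

for respondents in (50, 100, 400, 1000):
    print(respondents, round(100 * margin_of_error(respondents), 1), "%")
```

Going from 100 to 400 respondents halves the margin of error (roughly ±9.8% to ±4.9%), since precision improves with the square root of n.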

Weaknesses continued
Validity can be questionable: results tend to be highly dependent on the wording of items, the salience of the survey, and the organization of the instrument
Invite socially desirable responses
Provide indirect evidence, which may have less legitimacy with stakeholders
Better for measuring and comparing the responses of groups than of individuals

Finally, it might not be a survey if…
The questions you want to ask don't have a limited number of known, well-defined possible answers
You want to be able to ask about relationships rather than inferring them
Your population of interest differs in culture or language from the majority

Some other types of assessments
Standardized exams
Tests of abilities or knowledge
Simulations or performance appraisals
Interviews and focus groups
External examiners
Archival records and transcript analysis
Portfolios
Behavioral observations
Student self-evaluations
Reflective writing
Minute papers / muddiest point

Final thoughts on choosing an assessment tool
Go back to your assessment question(s)
◦ What do you want to know?
◦ What are the resource limitations (e.g., time, money, staff)?
◦ One-shot or longitudinal?
◦ Experimental design?
◦ What type of analysis is appropriate?

NOW IT'S YOUR TURN

Think about a program you want to assess…
What do you want to know?
What type of measure?
How many students are involved?
What type of evidence do you already have (if any)?
What type of evidence is most effective with your intended audience?

Sample assessment questions
Do participants in an alternative spring break (n=20) develop an increased awareness of social injustice and subsequently a greater commitment to working for social justice?
Do students (n=5,000) who go through an alcohol intervention drink less as a result?
Do participants in the PRCC's Learning Circle (n=15) exhibit improved ability to engage in positive cross-racial interaction with other participants?

Resources
Astin, A. W. (1993). Assessment for excellence: The philosophy and practice of assessment and evaluation in higher education. American Council on Education Series on Higher Education. Phoenix, AZ: Oryx Press.
Improving Educational Programming online training module:
Pope, R. L., Reynolds, A. L., Mueller, J. A., & Cheatham, H. E. (2004). Multicultural competence in student affairs. San Francisco: Jossey-Bass.
SARA Assessment Resources website:
Writing Guide: Survey Research, Colorado State University:
Yin, A. C., & Volkwein, J. F. (2009). Assessing general education outcomes. In J. F. Volkwein (Ed.), Assessing student outcomes: Why, who, what, how? New Directions for Institutional Research Assessment Supplement (pp. ). San Francisco: Jossey-Bass.

QUESTIONS?