NOW WHAT? Charting Your Course through Using NSSE Data Regional NSSE Users Workshop October 19-20, 2006.

Presentation transcript:

Agenda
 Where do I start? This binder is enormous!
 There are skeptics on my campus
 Are these really high-quality data?
 The Facilitator’s Guide in Action
 Sharing your data
 Acting on your data

Where do I start?
 What burning questions do you have about your students?
 What are the hot topics on your campus?
 How can these data inform those topics?

Getting to Know NSSE Data
 Respondent Characteristics
 Comparative Data [National, Carnegie, Selected Peers]
 Means Comparison
 Frequency Distribution
 Benchmark Data
 Data File

Pretend These Data Are Yours
 What do these data indicate?
 With whom might you share these results?
 What might you want to communicate?
 What implications do you see for assessment and retention?

How do I deal with skeptics? Skeptics tend to ask about the following:
 Why are we administering this survey?
 Is the survey valid and reliable?
 What is the research foundation?
 Others?

Why? Assessment Strategies
 “The institutional attitude should encourage organizational constituents to ask – about all policies, programs, and procedures – ‘What is our source of evidence for that assertion?’” (Wolff & Harris, 1994, p. 276)
 “Examine, share, and act on assessment findings” (Palomba & Banta, 1999, p. 14)

Wolff, R. A., & Harris, O. A. (1994). Using assessment to develop a culture of evidence. In D. Halpern (Ed.), Changing college classrooms: New teaching and learning strategies for an increasingly complex world (pp. 271–288). San Francisco: Jossey-Bass.
Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco: Jossey-Bass.

Why? Focus Assessment on What Matters in College: Student Engagement
“Because individual effort and involvement are the critical determinants of college impact, institutions should focus on the ways they can shape their academic, interpersonal, and extracurricular offerings to encourage student engagement.” (Pascarella & Terenzini, 2005, p. 602)

How do I deal with skeptics? Skeptics tend to ask about the following:
 Why are we administering this survey?
 Is the survey valid and reliable?
 What is the research foundation?
 Others?

Validity of Self-Reported Data
 Self-reported data are valid if five conditions are met:
1. Information is known to respondents
2. Questions are phrased clearly & unambiguously
3. Questions refer to recent activities
4. Respondents think the questions merit a serious & thoughtful response
5. Answering the questions does not threaten or embarrass students, violate their privacy, or encourage them to respond in socially desirable ways
 NSSE was intentionally designed to satisfy these five conditions

Does the instrument yield valid information?
 Survey items are clearly worded, well-defined, and have high content and construct validity
 Relationships among items are consistent with objective measures and other research
 Responses are normally distributed
 Patterns of responses are consistent both within and across major fields and institutions

Data Quality
 Random sampling from similar population types
 Response rate: 39% in 2006
 Sampling error: an estimate of the margin likely to contain your “true” score. For example, if 60% of your students reply “very often” and the sampling error is ±5%, the true value likely lies between 55% and 65%
 More respondents → smaller sampling error
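The arithmetic behind the sampling-error example above can be sketched with the standard margin-of-error formula for a proportion. This is a simplified illustration (the respondent count of 370 is hypothetical, and NSSE's reported sampling errors may be computed differently):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate sampling error for a proportion.

    p: observed proportion (e.g., 0.60 for 60% answering "very often")
    n: number of respondents
    z: critical value (1.96 corresponds to a 95% confidence level)
    """
    return z * math.sqrt(p * (1 - p) / n)

# With 60% answering "very often" among a hypothetical 370 respondents,
# the margin works out to roughly ±5 percentage points, i.e. 55%-65%.
me = margin_of_error(0.60, 370)
print(f"±{me * 100:.1f} points")
print(f"likely range: {(0.60 - me) * 100:.0f}%-{(0.60 + me) * 100:.0f}%")
```

Because n appears under the square root in the denominator, doubling the number of respondents shrinks the sampling error by a factor of about 1.4, which is the "more respondents, smaller sampling error" point on the slide.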

How do I deal with skeptics? Skeptics tend to ask about the following:
 Why are we administering this survey?
 Is the survey valid and reliable?
 What is the research foundation?
 Others?

Research Foundation
 Student engagement: based on the work of George Kuh and C. Robert Pace
 Current exemplars:
   Student Success in College: Creating Conditions That Matter (Project DEEP)
   Assessing Conditions to Enhance Educational Effectiveness: The Inventory for Student Engagement and Success

Communicating Your Results
 It’s your call on how to do this!
 Sample PowerPoint presentation in binder
 Focus groups (faculty, students)
 Facilitator’s guide can help you get organized

Sharing NSSE Results…Stimulating Conversation on Campus
“NSSE is a great way to stimulate reflection and debate about what we do more and less well, and why. For us it’s proving an exciting and enlivening tool for self-reflection and self-improvement.”
--Michael McPherson, President of The Spencer Foundation (former President of Macalester College)

Communicating Results - INTERNAL
Internal sharing of NSSE 2005 results (%):
  President: 81
  Faculty: 74
  Administrative Staff: 74
  Department Chairs: 64
  Academic Advisors: 49
  Governing Board: 35
  Students: 31
  Other (web site, fact book, etc.): 18

Communicating Results - EXTERNAL
External sharing of NSSE 2005 results (%):
  Accreditation Agencies: 34
  No External Disclosure: 24
  Web Site: 22
  Other: 18
  State Agencies: 14
  Media: 13
  Prospective Students: 11
  Parents: 11
  Alumni: 10

Using Your Data
 Benchmarking
   Normative approach
   Criterion approach
 Accreditation preparation (see the Toolkit in your binder or on our Web site)
 Link to institutional data
 Link to other survey data

The Facilitator’s Guide in Action
 Simply reporting results may not lead to action
 The Facilitator’s Guide is an instructor’s manual and can help you get organized
 Provides suggestions for leading a workshop or session on understanding, interpreting, and taking action on NSSE data

The Facilitator’s Guide in Action
 Topic 1: Respondent Characteristics
 Topic 2: Mean Comparisons
 Topic 3: Pattern Analysis
 Topic 4: Frequency Distributions
 Topic 5: Benchmark Comparisons
 Copies available:

The Facilitator’s Guide in Action
 Exercise 4: Frequency Distributions
 Good for an internal view of your own students
 Look at the “never” responses. What percentages are too high? Do these make sense? What does this tell us?
 What other encouraging or challenging patterns do you see in the frequency reports?

Making Sense of Data: Two Approaches
 Most valued activities: What is most valued at your institution and in its departments, and what do the data show?
 Eliminate “nevers”: Work on reducing or eliminating reports by students of never doing specific engagement activities.
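The “eliminate nevers” approach can be sketched as a simple screen over a frequency distribution. The item names, percentages, and the 20% threshold below are all hypothetical, chosen only to illustrate the idea, not drawn from real NSSE results:

```python
# Hypothetical frequency distribution: percent of students choosing each
# response option for a few engagement items (illustrative numbers only).
freq = {
    "Discussed ideas with faculty outside class": {"never": 38, "sometimes": 40, "often": 15, "very often": 7},
    "Worked with classmates outside class":       {"never": 12, "sometimes": 45, "often": 30, "very often": 13},
    "Made a class presentation":                  {"never": 25, "sometimes": 50, "often": 20, "very often": 5},
}

def flag_high_nevers(freq, threshold=20):
    """Return items whose 'never' percentage exceeds the threshold, worst first."""
    flagged = [(item, pcts["never"]) for item, pcts in freq.items()
               if pcts["never"] > threshold]
    return sorted(flagged, key=lambda pair: pair[1], reverse=True)

for item, pct in flag_high_nevers(freq):
    print(f"{pct:>3}% never: {item}")
```

The flagged list gives a campus a short, concrete agenda: for each item, ask whether the “never” rate makes sense for your students and, if not, which programs or policies could reduce it.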

The Facilitator’s Guide in Action
 Exercise 5: Benchmark Reports
 Good for external comparison purposes
 What are the patterns here?
 What are our strong points? Our challenges?
 How does our institution perform and compare, given our student and institutional characteristics?

Making Sense of Data: Benchmarking
Two approaches:
 Normative: compares your students’ responses to those of students at other colleges and universities.
 Criterion: compares your school’s performance against a predetermined value or level appropriate for your students, given your institutional mission, size, curricular offerings, funding, etc.

Discussion and Questions
Rob Aaron
Client Services Manager
Indiana University Center for Postsecondary Research
1900 East 10th Street, Eigenmann Hall, Suite 419
Bloomington, IN
Ph:
Fax:
Web site: