
1 Using NSSE to Answer Assessment Questions
Shimon Sarraf, Center for Postsecondary Research, Indiana University Bloomington
Regional Users' Workshop, October 2005

2 Overview
- Why should "engagement" be assessed?
- Assessment techniques with NSSE data
- Group exercise and discussion with "Nesseville State" data

3 Why should engagement be assessed?
"Because individual effort and involvement are the critical determinants of college impact, institutions should focus on the ways they can shape their academic, interpersonal, and extracurricular offerings to encourage student engagement." (Pascarella & Terenzini, How College Affects Students, 2005, p. 602)

4 Who says engagement is important?
- Quality of Effort (Pace)
- Student Involvement (Astin)
- Social and Academic Integration (Tinto)
- Good Practices in Undergraduate Education (Chickering & Gamson)
- Student Engagement (Kuh)

5 Assessment Approaches
- Normative: compares your students' responses to those of students at other colleges and universities.
- Criterion: compares responses against a predetermined value or level appropriate for your students, given your institutional mission, size, curricular offerings, funding, etc.
- Longitudinal: compares your average scores over time.
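To make the three approaches concrete, here is a minimal Python sketch; the scale scores, peer-group mean, criterion target, and prior-year mean are all hypothetical values invented for illustration.

```python
import pandas as pd

# Hypothetical student-level scores on an engagement scale (0-100)
ours = pd.Series([52, 61, 58, 47, 66, 55, 60, 49])
peers_mean = 54.3          # normative: peer-group mean
criterion = 60.0           # criterion: target set for our mission
last_year_mean = 53.1      # longitudinal: our own prior-year mean

mean_now = ours.mean()
print(f"Normative:    {mean_now:.1f} vs. peers {peers_mean:.1f} "
      f"(gap {mean_now - peers_mean:+.1f})")
print(f"Criterion:    {mean_now:.1f} vs. target {criterion:.1f} "
      f"(gap {mean_now - criterion:+.1f})")
print(f"Longitudinal: {mean_now:.1f} vs. last year {last_year_mean:.1f} "
      f"(change {mean_now - last_year_mean:+.1f})")
```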

6 Assessment with NSSE Data
- Descriptive displays of engagement patterns by any number of student characteristics
- Use of individual items and/or scales
- Year-to-year tracking of student engagement
- Multivariate models for retention, degree attainment, grades, and other outcomes
- Special peer comparisons with aspirational, regional, and mission-related institutions

7 Descriptive Analysis
- Comparisons by student background
  - Minority students
  - First-generation college students
- Comparisons by enrollment characteristics
  - Greek-letter organization members
  - Athletes
  - College and/or department

8 Approaches to Descriptive Analysis
- Most valued activities: What is most valued at your institution and in its departments, and what do the data show?
- Investigate "nevers": Work to reduce or eliminate reports by students of never doing specific engagement activities.
- How much variation? Use box-and-whisker plots.
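One way to operationalize the "nevers" and variation ideas above is sketched below with pandas and matplotlib; the item name, departments, and response codes are hypothetical (NSSE-style frequency items are assumed to be coded 1 = never through 4 = very often).

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical item responses: 1=never, 2=sometimes, 3=often, 4=very often
df = pd.DataFrame({
    "dept": ["Civil", "Civil", "Electrical", "Electrical",
             "Mechanical", "Mechanical"],
    "discussed_ideas_with_faculty": [1, 2, 1, 1, 3, 4],
})

# Share of students reporting "never" for the activity, by department
pct_never = (df.groupby("dept")["discussed_ideas_with_faculty"]
               .apply(lambda s: (s == 1).mean() * 100))
print(pct_never)

# Box-and-whisker plot to show variation, not just the mean
df.boxplot(column="discussed_ideas_with_faculty", by="dept")
plt.show()
```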

9 Descriptive Analysis: Responses of Seniors by Major

10 Descriptive Analysis: Responses of Seniors by Major (continued)

12 Descriptive Analysis

13 Descriptive Analysis: Seniors' Scale Scores by Transfer Status (t-test: p < .001; effect size: -.29)
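A minimal sketch of this kind of comparison using scipy: Welch's t-test plus a pooled-SD effect size (Cohen's d). The scores below are fabricated for illustration and will not reproduce the values reported on the slide.

```python
import numpy as np
from scipy import stats

# Hypothetical scale scores for seniors, split by transfer status
native   = np.array([58.0, 62.5, 55.0, 60.0, 64.0, 57.5, 61.0])
transfer = np.array([52.0, 49.5, 55.5, 50.0, 53.0, 48.5])

# Independent-samples t-test (Welch's, not assuming equal variances)
t, p = stats.ttest_ind(transfer, native, equal_var=False)

# Cohen's d using the pooled standard deviation
n1, n2 = len(transfer), len(native)
pooled_sd = np.sqrt(((n1 - 1) * transfer.var(ddof=1) +
                     (n2 - 1) * native.var(ddof=1)) / (n1 + n2 - 2))
d = (transfer.mean() - native.mean()) / pooled_sd
print(f"t = {t:.2f}, p = {p:.4f}, effect size d = {d:.2f}")
```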

14 Variations in Student-Faculty Interaction by Discipline

15 Data Consideration: Disaggregating Results
- Experience indicates that survey results are most likely to be used when they are disaggregated by specific program or unit (e.g., college or department).
- Targeted oversamples of specific units may be warranted.
- Sampling error statistics may not be a good indicator of data quality for smaller units, as the sketch below illustrates.
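A quick numerical illustration of that last point: the standard error of a mean grows as group size shrinks, so department-level estimates are inherently noisier than institution-level ones. The standard deviation used here is an assumed value.

```python
import numpy as np

# The standard error of a mean shrinks with sqrt(n), so estimates for a
# whole institution are far more precise than those for a small department.
scale_sd = 15.0  # assumed standard deviation of an engagement scale

for n in (1000, 200, 50, 12):
    se = scale_sd / np.sqrt(n)
    print(f"n = {n:4d}  ->  standard error of the mean = {se:.2f}")
```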

16 Comparisons Across Years: FY Student Responses to Student-Faculty Items by Year

17 Comparisons Across Years: FY and Senior Student-Faculty Scale Scores by Year

18 Comparisons Across Years: FY Scores on Four Scales by Year

19 FY Student t-test Comparisons, 2003 vs. 2004, at Nesseville State

20 Regression of Student-Faculty Interaction on Year

21 Multivariate Modeling Regression model predicting grades at the end of the first year.
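A minimal sketch of such a model using statsmodels OLS; all variable names and data are hypothetical stand-ins for first-year GPA, an engagement scale, and pre-college controls.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level data: first-year GPA, an engagement scale,
# and pre-college controls (SAT score, first-generation status)
df = pd.DataFrame({
    "fy_gpa":     [3.1, 2.8, 3.6, 2.5, 3.9, 3.0, 2.2, 3.4],
    "engagement": [55, 48, 70, 42, 75, 58, 35, 63],
    "sat":        [1150, 1080, 1300, 1010, 1350, 1120, 980, 1210],
    "first_gen":  [0, 1, 0, 1, 0, 0, 1, 0],
})

# OLS regression of first-year GPA on engagement, net of controls
model = smf.ols("fy_gpa ~ engagement + sat + first_gen", data=df).fit()
print(model.summary())
```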

22 Multi-equation Modeling
A structural equation model explaining longitudinal relationships that lead to FY grades:
Pre-college → Engagement → Outcome
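Fitting a full SEM typically calls for dedicated tools (e.g., lavaan in R or semopy in Python). Under that caveat, here is a rough approximation of the pre-college → engagement → outcome chain as two sequential regressions; all variable names and data are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical longitudinal data following the
# pre-college -> engagement -> outcome chain
df = pd.DataFrame({
    "hs_gpa":     [3.2, 2.9, 3.8, 2.6, 4.0, 3.1, 2.4, 3.5],
    "engagement": [55, 48, 70, 42, 75, 58, 35, 63],
    "fy_gpa":     [3.1, 2.8, 3.6, 2.5, 3.9, 3.0, 2.2, 3.4],
})

# Equation 1: pre-college characteristics predict engagement
eq1 = smf.ols("engagement ~ hs_gpa", data=df).fit()
# Equation 2: engagement predicts the outcome, net of pre-college input
eq2 = smf.ols("fy_gpa ~ engagement + hs_gpa", data=df).fit()

print(eq1.params, eq2.params, sep="\n")
```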

23 Special Peer Comparisons
Selecting a peer group:
- By mission
- By size
- By department
- By race
- By locale
- Current or aspirant peers

24 Special Peer Comparisons: Standard Frequency Report with Selected Peer Group

25 Special Peer Comparisons: Carnegie Group, Living On-Campus, Commuters

26 Special Peer Comparisons: Student-Level Benchmark Report

27 Special Peer Comparisons: Student Distributions
First-year academic challenge scores: Are these two schools the same? They have the same median benchmark score but a different range of scores.
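A small sketch of why medians alone can mislead: two fabricated score distributions with the same median but very different spreads.

```python
import numpy as np

# Hypothetical first-year academic challenge scores for two schools
# with the same median but very different spreads
school_a = np.array([48, 50, 51, 52, 53, 54, 56])
school_b = np.array([20, 35, 45, 52, 60, 70, 85])

for name, scores in (("School A", school_a), ("School B", school_b)):
    q1, med, q3 = np.percentile(scores, [25, 50, 75])
    print(f"{name}: median = {med:.0f}, IQR = {q3 - q1:.0f}, "
          f"range = {scores.min()}-{scores.max()}")
```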

28 Data Considerations
- Standard error of the mean (precision of the estimate)
- Non-response bias
- Weighting your sample to look like the population (see the sketch below)
- Comparability of survey items year-to-year
- Use other assessment techniques (e.g., focus groups, other surveys) to validate your findings; NSSE is but one source of assessment information
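A minimal post-stratification sketch for the weighting bullet; the population shares, the single weighting variable (gender), and the scores are all assumptions for illustration. Real NSSE weighting may use more variables.

```python
import pandas as pd

# Hypothetical respondent data and known population shares by gender:
# weight = population share / sample share, so the weighted sample
# mirrors the population.
resp = pd.DataFrame({
    "gender": ["F", "F", "F", "F", "M"],   # sample: 80% F, 20% M
    "score":  [60, 55, 58, 62, 40],
})
pop_share = {"F": 0.55, "M": 0.45}          # known population composition

sample_share = resp["gender"].value_counts(normalize=True)
resp["weight"] = resp["gender"].map(lambda g: pop_share[g] / sample_share[g])

unweighted = resp["score"].mean()
weighted = (resp["score"] * resp["weight"]).sum() / resp["weight"].sum()
print(f"unweighted mean = {unweighted:.1f}, weighted mean = {weighted:.1f}")
```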

29 NSSE Consortium
- Six or more institutions sharing comparative data
- A great way to add value to participation
- Often mission-specific
- Ability to ask additional questions
Select consortia: Urban Institutions, Women's Colleges, Private Liberal Arts, Research Universities, HBCUs, Christian Colleges, Jesuit Institutions, State Systems

30 Sample Consortium Questions

31 Assessment Exercise: Department-Level Analysis
Scenario:
- Nesseville State University is preparing for an upcoming accreditation review of its engineering program.
- The college was encouraged to incorporate more "student voice" into its educational outcomes assessment.
- The University Provost and College Dean have worked to increase buy-in for using NSSE to collect this information.

32 Assessment Exercise: Department-Level Analysis
Concerns to address:
- Faculty are concerned that the Engineering College places too little emphasis on challenging and engaging pedagogical practices.
- The Dean is concerned that some departments are not preparing their students for life after graduation as well as others.
- The Provost would like to know how NSU engineering students compare to engineering students nationwide.
- In previous campus surveys, engineering students have voiced dissatisfaction with their undergraduate experience.

33 Assessment Exercise: Department-Level Analysis
Building the analysis:
- In submitting its population file, Nesseville State University included an extra variable to identify engineering students and their departments within the college.
- Nesseville State indicated that it wished to oversample all engineering seniors not identified for the random institutional sample.
- NSU constructed several NSSE student-level scales as a basis for its analysis (see the sketch below) and requested a special analysis from NSSE to obtain normative data.
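A hedged sketch of constructing a student-level scale in the NSSE style (the mean of related items, rescaled to 0-100 for reporting); the item names and coding are hypothetical, not actual NSSE variable names.

```python
import pandas as pd

# Hypothetical NSSE-style items on a 1-4 frequency scale; a student-level
# scale is the mean of the items, rescaled to 0-100 for reporting.
items = pd.DataFrame({
    "asked_questions":        [4, 2, 3, 1],
    "worked_with_classmates": [3, 2, 4, 2],
    "discussed_with_faculty": [2, 1, 3, 1],
})

# Rescale each 1-4 item to 0-100, then average across items per student
scale = ((items - 1) / 3 * 100).mean(axis=1)
print(scale)
```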

34 Assessment Exercise: Department-Level Analysis
Discussion questions:
- What patterns are evident in these results?
- Were the expressed stakeholder concerns confirmed?
- What differences are notable among departments?
- What other sources of data would help shed light on these results?
- What additional analyses would you want to conduct?

35 Using NSSE to Answer Assessment Questions
Shimon Sarraf, Research Analyst
Indiana University Center for Postsecondary Research
1900 East 10th Street, Eigenmann Hall, Suite 419
Bloomington, IN 47406
Ph: 812-856-2169
ssarraf@indiana.edu
www.nsse.iub.edu

