Using the Community College Survey of Student Engagement (CCSSE) for Institutional Improvement
Daniel Martinez, PhD
Director, Institutional Research
2/27/2014
CCSSE Background
- Started in 2001
- Research-based student & faculty surveys
- Why student engagement? Learning and retention
CCSSE Survey at COD
Administered in 2008, 2010, and 2012.

Year | Sample Size | Headcount
2008 | 895         | 10,255
2010 | 916         | 11,039
2012 | 1,021       | 9,579
CCSSE: A Tool for Improvement
There are three ways to use CCSSE:
- Benchmarks
- Direct information
- Institutional measures
CCSSE Benchmarks
CCSSE Benchmarks for Effective Educational Practice The five CCSSE benchmarks are: - Active and Collaborative Learning - Student Effort - Academic Challenge - Student-Faculty Interaction - Support for Learners
Benchmarking – and Reaching for Excellence
The most important comparison: where you are now, compared with where you want to be.

CCSSE offers five ways that colleges can use benchmarks to better understand their performance and reach for excellence. They can:
- Compare their performance to the national average, while resisting the average. Comparing themselves to the average of participating colleges (the 50 mark) is a start, but colleges should then assess their performance on the individual survey items that make up each benchmark. Most colleges will find areas for improvement at the item level.
- Compare themselves to high-performing colleges. A college might, for example, aspire to be at or above the 85th percentile on some or all benchmarks. Colleges also can learn by examining the practices of high-performing peers.
- Measure their overall performance against results for their least-engaged group. A college might aspire to make sure all subgroups (e.g., full-time and part-time students; developmental students; students across all racial, ethnic, and income groups) engage in their education at similarly high levels.
- Examine areas that their colleges value strongly. They might focus, for example, on survey items related to serving high-risk students or on items related to academic rigor (e.g., are they asking students to read and write enough?).
- Make the most important comparison: where they are now compared with where they want to be. This is the mark of an institution committed to continuous improvement.
COD: 2008-2012
COD: 2008-2012
COD: 2008-2012
Active and Collaborative Learning
Student Effort
Academic Challenge
Student-Faculty Interaction
Support for Learners Survey results reveal both areas in which colleges are doing well and areas for improvement in creating multiple, intentional connections with students, beginning with the first point of contact with the college. For example, nearly nine in 10 SENSE respondents (88%) agree or strongly agree that they knew how to get in touch with their instructors outside of class. But more than two-thirds (68%) of SENSE respondents and 47% of CCSSE respondents report that they never discussed ideas from readings or classes with instructors outside of class. These results clearly indicate opportunities for colleges to increase their intentionality in seeking to build meaningful connections with students.
Active and Collaborative Learning
Student Effort
Academic Challenge
Student-Faculty Interaction
Support for Learners
Select Comparisons: COD vs. Others
Select Comparisons: COD vs. Others
Select Comparisons: COD vs. Others
Select Comparisons: COD vs. Others
Select Comparisons: COD vs. Others
Select Comparisons: COD vs. Others
Select Comparisons: COD vs. Others
Select Comparisons: COD vs. Scorecard Colleges
Select Comparisons: COD vs. Scorecard Colleges
Select Comparisons: COD vs. Scorecard Colleges
Select Comparisons: COD vs. Scorecard Colleges
Select Comparisons: COD vs. Scorecard Colleges
Select Comparisons: COD vs. Scorecard Colleges
Select Comparisons: COD vs. Scorecard Colleges
Direct Information
Academic Advising/Planning (2012)
Direct information is data taken directly from the survey itself. For example, questions 13a-k measure the frequency of use of, satisfaction with, and importance of various student services. This slide shows one of those questions.
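As a minimal sketch (not COD's actual data file or CCSSE's official layout; the column names and response codings here are purely illustrative), item-level "direct information" such as question 13a could be summarized from a respondent-level extract along these lines:

```python
import pandas as pd

# Hypothetical extract: item 13a (academic advising/planning) asks about
# frequency of use (1), satisfaction (2), and importance (3).
df = pd.DataFrame({
    "13a1_frequency":    [1, 2, 3, 2, 1],  # 1 = never ... 3 = often (illustrative coding)
    "13a2_satisfaction": [2, 3, 2, 1, 3],  # 1 = not at all ... 3 = very (illustrative coding)
    "13a3_importance":   [3, 3, 2, 3, 2],  # 1 = not at all ... 3 = very (illustrative coding)
})

# Direct information: report the item-level means (or distributions) straight from the survey.
print(df.mean().round(2))
```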
Institutional Measures
The questions on the CCSSE have been mapped to the WASC-ACCJC accreditation standards by CCSSE. However, this is their interpretation and is not a hard-and-fast rule: questions can be added or deleted, and standards not addressed here could be mapped to CCSSE items if the institution felt the match was a good one. This is a useful way to see which standards are addressed via the CCSSE and, by the same token, which standards may need alternative forms of data or evidence. It also means that other college goals can be supported via the CCSSE; the questions just need to be mapped to the goal, as sketched below.
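A minimal sketch of what such a mapping might look like in practice, assuming a respondent-level record keyed by item label. The standard and goal names, item subsets, and response values below are illustrative only, not CCSSE's official crosswalk:

```python
# Hypothetical crosswalk: accreditation standard (or local college goal) -> CCSSE item labels.
STANDARD_TO_ITEMS = {
    "IIA1a": ["4l", "4o", "8b", "8d", "9b", "12e"],   # illustrative subset only
    "CollegeGoal_Engagement": ["4a", "4b", "4f"],      # a locally defined goal
}

def composite_score(respondent, standard):
    """Sum one respondent's responses across the items mapped to a standard or goal."""
    return sum(respondent[item] for item in STANDARD_TO_ITEMS[standard])

# Example respondent record (item label -> response value on the survey's own scale).
respondent = {"4l": 2, "4o": 3, "8b": 1, "8d": 2, "9b": 3, "12e": 4,
              "4a": 2, "4b": 3, "4f": 1}
print(composite_score(respondent, "IIA1a"))
```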
IIA1a
When questions are mapped to a goal, the measures can be standardized for comparison purposes. For instance, CCSSE says that accreditation standard IIA1a is measured using the following survey items: 4l, 4o, 8b, 8d, 8e, 8f, 8g, 8h, 8i, 9b, 9d, 12a, 12b, 12c, 12d, 12e, 12f, 12g, 12h, 12i, 12n, 12o, 13d2&3, 13e2&3, 13h2&3, 13k1&2&3, 14c, 17a, 17b, 17d, 17e, 17f, 22. Not all of these items are on the same metric. When all of the responses to these items are added together, the range of possible scores is 33-130. Such differing scales can be normalized (that is, rescaled to run from 0 to 1, or equivalently 0 to 100) using the following formula: normalized score = (raw score - minimum score) / (maximum score - minimum score). In this example, that is (raw score - 33) / (130 - 33), as illustrated below. The mean raw score on this measure for 2010 and 2011 was 79.48 and 79.79, respectively; the normalized scores, shown here on the 0-100 scale, are 47.90 and 48.34, respectively.
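A minimal sketch of that calculation; the function simply restates the formula and figures above, and small differences from the reported normalized scores reflect rounding of the raw means:

```python
def normalize(raw_score, min_score=33, max_score=130):
    """Rescale a raw IIA1a composite score to the 0-1 range."""
    return (raw_score - min_score) / (max_score - min_score)

# Mean raw scores reported above for 2010 and 2011.
for year, mean_raw in [(2010, 79.48), (2011, 79.79)]:
    score = normalize(mean_raw)
    print(f"{year}: {score:.4f} (about {score * 100:.1f} on a 0-100 scale)")
```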
Critical Thinking
- Analyzing and solving complex problems
  - 5b. Analyzing the basic elements of an idea, experience, or theory
  - 5e. Applying theories or concepts to practical problems or in new situations
- Constructing sound arguments and evaluating the arguments of others
  - 5d. Making judgments about the value or soundness of information, arguments, or methods
- Considering and evaluating rival hypotheses
- Recognizing and assessing evidence from a variety of sources
- Generalizing appropriately from specific cases
- Integrating knowledge across a range of academic and everyday contexts
  - 4d. Worked on a paper or project that required integrating ideas or information from various sources
  - 5c. Synthesizing and organizing ideas, information, or experiences in new ways
- Identifying your own and others' assumptions, biases, and their consequences
- OVERALL
  - 12e. Thinking critically and analytically
Next Steps?