1
Reading the COACHE Report
November 4 & 5, 2015
CC105
2
WHAT IS THE COACHE SURVEY?
3
The COACHE survey gathers information from faculty on their experiences, perceptions, and views in the following work life areas:
Research, teaching, service
Resources in support of faculty work
Benefits, compensation, and work/life
Interdisciplinary work and collaboration
Mentoring
Tenure and promotion practices
Leadership and governance
Departmental collegiality, quality, engagement
Appreciation and recognition
4
The COACHE Team
Amy Wolfson, Ph.D., Academic Vice President

The COACHE Advisory Team:
Jeff Barnett, PsyD, ABPP, Professor of Psychology and Associate Dean, LCAS
Kathy Forni, Ph.D., Professor of English and Chair of the Faculty Affairs Committee
Lorie Holtgrave, M.A., Director of Budgets and Data Management, Office of Academic Affairs
Brian Norman, Ph.D., Associate Professor of English and Associate VP for Faculty Affairs and Diversity
5
Use of the Survey Results
The COACHE survey provides peer comparisons among the 112 participating institutions; this year Loyola selected five of them as comparison institutions: College of the Holy Cross, Franklin and Marshall College, Gonzaga University, Providence College, and the University of Richmond. It also provides a sense of change over time in comparison to Loyola’s 2011 survey results.
6
Use of the Survey Results (cont.)
Use of the data over time as part of an ongoing process.
Comparison to past COACHE survey data for faculty perceptions of change, strengths, and areas of need.
Comparison of the data to our comparison institutions and similarly sized schools.
Dissemination of the data to all faculty members as well as to specifically targeted groups.
7
An Ongoing Collaborative Process
Ongoing discussions among the faculty, with additional feedback provided to the leadership.
Plans for addressing areas of concern.
Ongoing work and new initiatives to address areas of need.
Continued targeted efforts to build on strengths.
Additional COACHE surveys in the future.
8
The COACHE survey results, resources, and instructions are available at: http://www.loyola.edu/department/fdd/work/coache
The COACHE survey is intended to be a diagnostic and comparative management tool for college and university leaders. The goal is to use these results to engage in ongoing discussions and to develop and implement plans that continually improve satisfaction, for all faculty members, in each of the work life areas listed earlier.
9
The COACHE Report
Results are provided in the Provost’s Report, the COACHE Dashboard, the COACHE Digital Report Portfolio, and the accompanying Supplemental Materials.
10
Provost’s Report
A quick, at-a-glance review of the survey results overall.
Response rates are provided for Loyola University Maryland as well as for the five comparison institutions.
Each column represents the range of institutional means.
11
Reading the Provost’s Report
Our mean score for each dimension is marked with a black diamond. ♦
Other institutions’ mean scores are marked with an open circle. ○
The distribution of the entire cohort is represented by red, grey, and green boxes (see the next slide).
A score in the red section falls within the bottom 30 percent of all institutions.
A score in the green section falls within the top 30 percent of all institutions.
A mark in the grey section indicates a middle-of-the-road result (a rough sketch of this color-band logic follows below).
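As a rough illustration of the color bands described above, the sketch below classifies an institution’s mean score against the distribution of all institutional means, assuming the red and green bands correspond to the bottom and top 30 percent of that distribution. The function and data are hypothetical and are not part of the COACHE tools.

```python
# Illustrative only: classify a mean score into the red/grey/green bands
# described above, assuming the bands are the bottom 30% and top 30% of
# all institutional means (made-up data, not the actual COACHE cohort).

def classify_band(our_mean: float, cohort_means: list[float]) -> str:
    """Return 'red', 'grey', or 'green' for our_mean relative to the cohort."""
    n = len(cohort_means)
    # Share of institutions whose mean is at or below ours.
    share_at_or_below = sum(m <= our_mean for m in cohort_means) / n
    if share_at_or_below <= 0.30:
        return "red"    # bottom 30 percent of all institutions
    if share_at_or_below >= 0.70:
        return "green"  # top 30 percent of all institutions
    return "grey"       # middle-of-the-road result

# Example with made-up means on a five-point scale:
cohort = [3.1, 3.4, 3.5, 3.6, 3.7, 3.8, 3.9, 4.0, 4.1, 4.2]
print(classify_band(3.2, cohort))  # -> 'red'
print(classify_band(3.7, cohort))  # -> 'grey'
print(classify_band(4.1, cohort))  # -> 'green'
```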
14
COACHE identifies an area as a strength for LUM if our score ranks in the top two among our comparison institutions and also falls within the top 30 percent of all institutions. COACHE identifies an area as an area of concern for LUM if our score ranks in the bottom two among our comparison institutions and also falls within the bottom 30 percent of all institutions.
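The sketch below restates this identification rule in code; the ranking and percentile handling, and all of the data, are illustrative assumptions rather than COACHE’s actual computation.

```python
# Illustrative restatement of the strength / area-of-concern rule described
# above (hypothetical data and thresholds, not the COACHE implementation).

def identify_area(lum_mean, peer_means, cohort_means):
    """Label a benchmark 'strength', 'concern', or 'neither' for LUM."""
    # Rank against the five selected comparison institutions.
    peers_above = sum(m > lum_mean for m in peer_means)
    peers_below = sum(m < lum_mean for m in peer_means)
    # Placement within the full cohort of participating institutions.
    share_at_or_below = sum(m <= lum_mean for m in cohort_means) / len(cohort_means)

    top_two_of_peers = peers_above <= 1       # at most one peer scores higher
    bottom_two_of_peers = peers_below <= 1    # at most one peer scores lower
    top_30_of_cohort = share_at_or_below >= 0.70
    bottom_30_of_cohort = share_at_or_below <= 0.30

    if top_two_of_peers and top_30_of_cohort:
        return "strength"
    if bottom_two_of_peers and bottom_30_of_cohort:
        return "concern"
    return "neither"

# Example with made-up means:
peers = [3.5, 3.6, 3.8, 3.9, 4.0]
cohort = [3.2, 3.4, 3.5, 3.6, 3.7, 3.8, 3.9, 4.0, 4.1, 4.2]
print(identify_area(4.1, peers, cohort))  # -> 'strength'
print(identify_area(3.3, peers, cohort))  # -> 'concern'
```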
15
The COACHE Dashboard
For each of the nine areas assessed, two adjacent triangles compare our faculty members’ ratings to those of our selected comparison institutions (to the left) and to the entire cohort (to the right).
Red triangles indicate areas of concern relative to the indicated comparison group.
Green triangles indicate areas of strength relative to the indicated comparison group.
Grey triangles suggest unexceptional performance relative to the indicated comparison group.
Empty triangles indicate insufficient data to make meaningful comparisons.
16
Basic Overview of Data Types
Raw data/Likert numbers
Green/red arrows
Peer/national comparison arrows
Longitudinal change from 2011
Ratio questions (percentages, pie charts, etc.)
19
To further understand these comparisons, effect sizes of small, moderate, and large are indicated; insignificant effect sizes are left blank. Following the COACHE Dashboard, a dashboard with individual item responses is provided, allowing for a more detailed and nuanced analysis of the results. Additionally, comparisons to COACHE data from prior years are provided: improvements over time are indicated with a plus sign (+) and declines with a minus sign (-). These data are provided only when the same question was asked both in the past and in the present survey.
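As a hedged illustration of these two conventions, the sketch below labels an effect size and a change-over-time indicator. The 0.2/0.5/0.8 cutoffs are the conventional Cohen’s d thresholds and are only an assumption here; the report’s own cutoffs may differ.

```python
# Illustrative labeling of an effect size and a change-over-time indicator.
# The 0.2 / 0.5 / 0.8 cutoffs are conventional Cohen's d thresholds and are
# an assumption; the COACHE report's own cutoffs may differ.

def effect_size_label(d: float) -> str:
    """Return 'small', 'moderate', 'large', or '' (left blank when negligible)."""
    magnitude = abs(d)
    if magnitude < 0.2:
        return ""          # insignificant effect sizes are left blank
    if magnitude < 0.5:
        return "small"
    if magnitude < 0.8:
        return "moderate"
    return "large"

def change_indicator(mean_now: float, mean_2011: float) -> str:
    """Return '+' for improvement since the prior survey, '-' for decline, '' for no change."""
    if mean_now > mean_2011:
        return "+"
    if mean_now < mean_2011:
        return "-"
    return ""

print(effect_size_label(0.35))     # -> 'small'
print(change_indicator(3.8, 3.6))  # -> '+'
```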
20
Other Data Provided
Additional data are provided separately for items that do not use a five-point Likert scale or that for other reasons do not yield a mean score for comparison purposes. One example is the item asking faculty members to select, from a list of common characteristics of the academic workplace, the two best and worst aspects of working at LUM. Additionally, a thematic analysis of responses to the open-ended questions is provided, along with the demographic characteristics of the survey respondents.
21
The COACHE Digital Report Portfolio
Provides access to an online tool for survey data analysis, with mean comparisons and frequency distributions for all survey results overall and by tenure status, rank, gender, and race/ethnicity. Means and frequency distributions are provided for LUM, for our selected comparison institutions, and for all institutions participating in the survey; each should be considered in interpreting the data.
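For readers who want to explore item-level data on their own, the sketch below shows, with entirely made-up responses, how a mean comparison and a frequency distribution by subgroup can be computed. The column names and values are hypothetical and do not reflect the portfolio’s actual schema or export format.

```python
# Illustrative only: computing a mean and a frequency distribution for one
# survey item by subgroup, in the spirit of the Digital Report Portfolio.
# Column names and data are made up, not the portfolio's actual schema.
import pandas as pd

responses = pd.DataFrame({
    "rank": ["Assistant", "Assistant", "Associate", "Full", "Full", "Associate"],
    "gender": ["F", "M", "F", "M", "F", "M"],
    "mentoring_satisfaction": [4, 3, 5, 2, 4, 3],  # five-point Likert scale
})

# Mean score for the item, overall and by rank.
print(responses["mentoring_satisfaction"].mean())
print(responses.groupby("rank")["mentoring_satisfaction"].mean())

# Frequency distribution (share of respondents giving each rating), by gender.
print(
    responses.groupby("gender")["mentoring_satisfaction"]
    .value_counts(normalize=True)
    .unstack(fill_value=0)
)
```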
22
Helpful Framing Questions for Conversations
First takes: What is surprising? How are the data consistent with your perceptions of our institution?
Group trends: Are there significant differences in the perception of some faculty (by gender, rank, tenure status, or within divisions) that create opportunities or raise concerns?
Priorities and comparisons: Considering the current circumstances at Loyola, are some ingredients or areas more important than others?
23
Questions to Consider in Reviewing the Data and to Guide Decisions Moving Forward
What has the biggest impact on our lives as faculty?
Are there areas of strength we nonetheless want to improve or cultivate in comparison to peer institutions, perhaps to be distinct?
Are there areas of relative weakness or dissatisfaction that nonetheless don’t warrant our attention right now?
Is there one item that can be addressed relatively quickly that you would prioritize?
Is there one item that would be your top priority, even if it requires focused effort over time or additional resources?
24
Worth a Closer Look
In addition to major trend lines, sometimes the data merit a closer look to get the full picture and to ask targeted questions. You might call these outliers or “wiggly data.” Some examples (two of these checks are sketched in code after this list):
Huh? moments: one-time surprises
Double-red arrows > 3.00: satisfaction, but less so than peers
Double-green arrows < 3.00: dissatisfaction, but less so than peers
Mixed bag: red-green pairings may point to peer-specific differences from national trends
Hot-button: issues of particular concern at the time of the survey
Divergences: group-specific variance on an area of strength or weakness
Long arcs: modest changes over time may point to something big
Blips that count: small questions may point to something larger
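As a rough sketch of the two arrow patterns above, the hypothetical helper below flags items where we score above 3.00 yet trail both comparison groups, or below 3.00 yet lead both. It is illustrative only and not part of the COACHE report.

```python
# Illustrative check for two of the "wiggly data" patterns described above:
# double-red arrows on an item we still rate above 3.00, and double-green
# arrows on an item we rate below 3.00 (hypothetical helper, not COACHE's).

def closer_look_flag(our_mean, vs_peers, vs_cohort):
    """vs_peers / vs_cohort are 'red' (we trail) or 'green' (we lead)."""
    if vs_peers == "red" and vs_cohort == "red" and our_mean > 3.00:
        return "satisfied overall, but less so than peers: worth a closer look"
    if vs_peers == "green" and vs_cohort == "green" and our_mean < 3.00:
        return "dissatisfied overall, but less so than peers: worth a closer look"
    return None

print(closer_look_flag(3.4, "red", "red"))      # double-red above 3.00
print(closer_look_flag(2.7, "green", "green"))  # double-green below 3.00
```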
25
Closer Look Example: Midcareer Mentoring
26
Next Steps & Conclusions