Reading the COACHE Report November 4 & 5, 2015 CC105.


WHAT IS THE COACHE SURVEY?

The COACHE survey gathers information from faculty on their experiences, perceptions, and views in the following work-life areas:
- Research, teaching, and service
- Resources in support of faculty work
- Benefits, compensation, and work/life
- Interdisciplinary work and collaboration
- Mentoring
- Tenure and promotion practices
- Leadership and governance
- Departmental collegiality, quality, and engagement
- Appreciation and recognition

The COACHE Team
Amy Wolfson, Ph.D., Academic Vice President

The COACHE Advisory Team:
- Jeff Barnett, PsyD, ABPP, Professor of Psychology and Associate Dean, LCAS
- Kathy Forni, Ph.D., Professor of English and Chair of the Faculty Affairs Committee
- Lorie Holtgrave, M.A., Director of Budgets and Data Management, Office of Academic Affairs
- Brian Norman, Ph.D., Associate Professor of English and Associate VP for Faculty Affairs and Diversity

Use of the Survey Results
The COACHE survey provides peer comparisons among the 112 participating institutions. This year Loyola selected five comparison institutions: College of the Holy Cross, Franklin and Marshall College, Gonzaga University, Providence College, and University of Richmond. The survey also provides a sense of change over time in comparison to Loyola's 2011 survey results.

Use of the Survey Results (cont.)
- Use of the data over time as part of an ongoing process.
- Comparison to past COACHE survey data for faculty perceptions of change, strengths, and areas of need.
- Comparison to our selected peer institutions and similar-size schools.
- Dissemination of the data to all faculty members as well as to specifically targeted groups.

An Ongoing Collaborative Process
- Ongoing discussions among the faculty, with additional feedback provided to the leadership.
- Plans for addressing areas of concern.
- Ongoing work and new initiatives to address areas of need.
- Continued targeted efforts to build on strengths.
- Additional COACHE surveys in the future.

The COACHE survey results, resources, and instructions are available at:

The COACHE survey is intended to be a diagnostic and comparative management tool for college and university leaders. The goal is to use these results to engage in ongoing discussions and to develop and implement plans that continually improve satisfaction, for all faculty members, in each of the work-life areas listed earlier.

The COACHE Report Results are provided in the Provost’s Report, the COACHE Dashboard, the COACHE Digital Report Portfolio, and the accompanying Supplemental Materials.

Provost’s Report
An at-a-glance review of the overall survey results. Response rates are provided for Loyola University Maryland as well as for the five comparison institutions. Each column represents the range of institutional means.

Reading the Provost’s Report
- Our mean score for each dimension is marked with a black diamond (♦).
- Other institutions' mean scores are marked with an open circle (○).
- The distribution of the entire cohort is represented by red, grey, and green boxes (see the next slide).
- A score in the red section falls within the bottom 30 percent of all institutions.
- A score in the green section falls within the top 30 percent of all institutions.
- A mark in the grey section indicates a middle-of-the-road result.

COACHE identifies an area as a strength for LUM if it ranks in the top two among our comparison institutions and also falls within the top 30 percent of all institutions. COACHE identifies an area as a concern for LUM if it ranks in the bottom two among our comparison institutions and also falls within the bottom 30 percent of all institutions.
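The two-part rule above can be sketched in code. This is an illustrative reconstruction of the logic as described on this slide, not COACHE's actual implementation; the function name and data layout are assumptions.

```python
# Hypothetical sketch of the strength/concern rule described above.
# classify_dimension and its inputs are illustrative, not COACHE's code.

def classify_dimension(our_mean, peer_means, cohort_means):
    """Classify one survey dimension for our institution.

    our_mean     -- our institution's mean score on the dimension
    peer_means   -- mean scores of the five selected comparison institutions
    cohort_means -- mean scores of all institutions in the cohort
    """
    # Rank within the peer group (ourselves plus comparisons), best first.
    group = sorted(peer_means + [our_mean], reverse=True)
    peer_rank = group.index(our_mean) + 1

    # Share of the cohort scoring below us (our percentile position).
    pct_below = sum(m < our_mean for m in cohort_means) / len(cohort_means)

    if peer_rank <= 2 and pct_below >= 0.70:          # top 30% of all institutions
        return "strength"
    if peer_rank >= len(group) - 1 and pct_below <= 0.30:  # bottom 30%
        return "concern"
    return "neither"
```

A dimension that is merely strong against peers but middling nationally (or vice versa) falls through to "neither", which matches the slide's requirement that both conditions hold.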

The COACHE Dashboard
For each of the nine areas assessed, two adjacent triangles compare our faculty members' ratings to those of our selected comparison institutions (left triangle) and to the entire cohort (right triangle):
- Red triangles indicate areas of concern relative to the indicated comparison group.
- Green triangles indicate areas of strength relative to the indicated comparison group.
- Grey triangles indicate unexceptional performance relative to the indicated comparison group.
- Empty triangles indicate insufficient data to make meaningful comparisons.

Basic Overview of Data Types
- Raw data/Likert numbers
- Green/red arrows
- Peer/national comparison arrows
- Longitudinal change from 2011
- Ratio questions (percentages, pie charts, etc.)

To further understand these comparisons, effect sizes of small, moderate, and large are indicated; insignificant effect sizes are left blank. Following the COACHE Dashboard, a dashboard with individual item responses is provided, allowing a more detailed and nuanced analysis of the results. Additionally, comparisons to COACHE data from prior years are provided: improvements over time are indicated with a plus sign (+) and declines with a minus sign (-). These data are provided only when the same question was asked both in the past and at present.
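The small/moderate/large labels can be pictured with a standardized mean difference (Cohen's d). The 0.2/0.5/0.8 cutoffs below are Cohen's conventional thresholds, used here as an assumption; the report does not state COACHE's exact cutoffs, and the pooling here is simplified.

```python
import statistics

# Illustrative sketch: labeling a mean difference by effect size.
# Cutoffs (0.2 / 0.5 / 0.8) are Cohen's conventions, assumed for illustration.

def cohens_d(group_a, group_b):
    """Standardized mean difference between two groups of ratings.

    Uses the standard deviation of the combined ratings as a
    simplified stand-in for a properly pooled SD.
    """
    pooled_sd = statistics.pstdev(group_a + group_b)
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

def effect_label(d):
    size = abs(d)
    if size < 0.2:
        return ""          # left blank, like an insignificant effect
    if size < 0.5:
        return "small"
    if size < 0.8:
        return "moderate"
    return "large"
```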

Other Data Provided
Additional data are provided separately for items that do not use a five-point Likert scale or otherwise do not yield a mean score for comparison. For example, faculty members are asked to select the two best and two worst aspects of working at LUM from a list of common characteristics of the academic workplace. Additionally, a thematic analysis of responses to open-ended questions is provided, along with demographic characteristics of the survey respondents.

The COACHE Digital Report Portfolio
Provides access to an online tool for survey data analysis, with mean comparisons and frequency distributions for all survey results overall and by tenure status, rank, gender, and race/ethnicity. Means and frequency distributions are provided for LUM, for our selected comparison institutions, and for all institutions participating in the survey; each should be considered in interpreting the data.

Helpful Framing Questions for Conversations
- First takes: What is surprising? How are the data consistent with your perceptions of our institution?
- Group trends: Are there significant differences in the perceptions of some faculty (by gender, rank, tenure status, or within divisions) that create opportunities or raise concerns?
- Priorities and comparisons: Considering the current circumstances at Loyola, are some areas more important than others?

Questions to Consider in Reviewing the Data and to Guide Decisions Moving Forward
- What has the biggest impact on our lives as faculty?
- Are there areas of strength we nonetheless want to improve or cultivate in comparison to peer institutions, perhaps to be distinctive?
- Are there areas of relative weakness or dissatisfaction that nonetheless don't warrant our attention right now?
- Is there one item that could be addressed relatively quickly that you would prioritize?
- Is there one item that would be your top priority, even if it requires focused effort over time or additional resources?

Worth a Closer Look
In addition to major trend lines, the data sometimes merit a closer look to get the full picture and to ask targeted questions. You might call these outliers or "wiggly data." Some examples:
- Huh? moments: one-time surprises
- Double-red arrows with means above 3.00: satisfaction, but less so than peers
- Double-green arrows with means below 3.00: dissatisfaction, but less so than peers
- Mixed bag: red-green pairings may point to peer-specific differences from national trends
- Hot-button: issues of particular concern at the time of the survey
- Divergences: group-specific variance in an area of strength or weakness
- Long arcs: modest changes over time may point to something big
- Blips that count: small questions may point to something larger
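The two arrow-based checks above can be automated. A minimal sketch, assuming a 5-point Likert scale with 3.00 as the midpoint; the dictionary fields and function name are hypothetical, not the COACHE export format.

```python
# Hypothetical sketch of the "double-red above 3.00" and "double-green
# below 3.00" checks: items where the raw mean and the comparison arrows
# point in opposite directions. Field names are illustrative assumptions.

def flag_wiggly(items):
    """Return (item name, note) pairs for items worth a closer look."""
    flags = []
    for item in items:
        satisfied = item["our_mean"] > 3.00          # above the scale midpoint
        double_red = item["vs_peers"] == "red" and item["vs_cohort"] == "red"
        double_green = item["vs_peers"] == "green" and item["vs_cohort"] == "green"
        if satisfied and double_red:
            flags.append((item["name"], "satisfied, but less so than peers"))
        elif not satisfied and double_green:
            flags.append((item["name"], "dissatisfied, but less so than peers"))
    return flags
```

Mixed red-green pairings and the other patterns on this slide are harder to automate and are better left for the targeted-question conversations the slide describes.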

Closer Look Example: Midcareer Mentoring

Next Steps & Conclusions