Comprehensive Evaluation: Institutional Effectiveness Committee Recommendations. Presentation to College Council, Office of Institutional Effectiveness.

Comprehensive Evaluation: Institutional Effectiveness Committee Recommendations
Presentation to College Council
Office of Institutional Effectiveness & Institutional Effectiveness Committee, November 11, 2017

Background
Evaluate continuous improvement processes (3-year cycle):
- Program Review
- Student Learning Outcomes (SLO)/Administrative Unit Outcomes (AUO) Assessment
- Planning
College Council reviewed/was informed of:
- Evaluation plan (fall 2016)
- Data collection efforts (spring 2017)
- Preliminary themes presented (spring 2017)

Discussion Plan
Institutional Effectiveness Committee (fall 2017):
- Review detailed results
- Make recommendations for improvement
Review Institutional Effectiveness Committee recommendations with campus:
- College Council (November 2017)
- Planning councils (December 2017 - February 2018)
- Academic Senate (10+1) (December 2017 - February 2018)
- Classified Senate (December 2017 - February 2018)
Final recommendations:
- Academic Senate (10+1) (March 2018)
- College Council (March 2018)

Evaluation Process
Two-component evaluation:
- Focus groups with each of the four wing planning councils, the Academic Senate, and the Classified Senate
- Campus-wide perception and opinion survey
Content:
- Program Review process & structure
- SLO/AUO Assessment process & structure
- Planning process & structure
- Support
- Timeframe
- Impact
- Communication
- TracDat

Main Themes & Considerations
- Desire to keep processes the same for a period of time
- Lessen the culture of fear
- Dialogue about and utility of processes need improvement:
  AUO data collection, peer review, staff/management hiring, staff development
- Some indication that process cycles are too short
- Integration across processes not easily observable
- College goals not always driving plans; linked afterwards
- Broaden participation, communication, and training

Program Review
- Develop a stronger mechanism for departments to identify and request department-specific data for instructional programs. Consider redesigning the current form, the timeline for requests, and communications about the process.
- Improve the peer review process for both instructional and support departments. Consider process, participants, training, and timeline.
- Redesign the SLO/AUO synthesis within program review to be more meaningful and less redundant. For the instructional wing, consider focusing the SLO area more on program SLO evaluation.
- Improve prompts and support for CTE faculty completing both the biennial and core indicator areas in Program Relevancy.

SLO/AUO Assessment
- No changes to the course SLO process.
- Evaluate each department's AUOs/KPIs to ensure they are measurable and meaningful.
- Clarify the AUO process, the three-year cycle, and training materials to emphasize that AUOs/KPIs can change annually as needed.

Planning/Annual Resource Requests (ARRs)
- Better articulate the relationship between college goals/objectives/priorities, departmental planning strategies, and SLOs/AUOs.
- Strengthen planning strategies to be less focused on resource needs and more focused on improvement.
- Improve the integration of categorical, ancillary, and ASOCC monies into the planning/ARR process.
- Improve the classified and management staffing ARR process and the prioritization of positions. Consider linking these to staffing plans or handling them as a separate process.
- Clarify the prioritization and funding mechanism for Professional Development ARRs.
- Increase communication to departments and the campus about the results of Annual Resource Requests.

TracDat
- Improve navigation and streamline the program review module.
- Develop training videos to provide on-demand support.
- Improve tracking reports for processes in TracDat.
- Support further integration of other technology with TracDat (e.g., Canvas, Microsoft BI).

Overall: All Processes
- Increase classified staff participation and collaboration in all phases of program review, AUOs, and the planning process.
- Encourage dialogue about the results of program review, SLOs, and planning strategies outside of the processes themselves. Consider department, division, or wing meetings as possible discussion arenas.
- Consider the length and alignment of process cycles.
- Provide ongoing training on processes to keep their purpose and outcomes in the forefront.

Comprehensive Evaluation Report
Data analyzed by the Office of Institutional Effectiveness.

Comprehensive Evaluation: Preliminary Themes from Focus Groups
Presented to College Council, May 2, 2017

Preliminary Themes: Program Review
- Improved since the prior cycle, with data analysis and the distinction between the Instruction and Support wings
- More program-specific data support needed
- CTE areas duplicating external accreditation
- Tying CSLO/AUO results into program review not clear; seems redundant
- Peer review process needs refining (training, timing, incorporating feedback)
- Classified staff involved to varying degrees (mostly not involved or only on the front end; no closing of the loop)

Preliminary Themes: SLO/AUO Assessment
- SLOs compliance-focused and not being used in a meaningful way
- Culture of fear surrounding assessment: feels punitive, so results don't show the need for improvement
- AUOs useful, but cumbersome
- Data collection problematic (sustainability, difficulty getting data, uncertainty about which data are needed)
- Need help developing assessments/KPIs

Preliminary Themes: Planning
- Integration across processes not easily observable
- College goals not always driving plans; typically addressed after strategies are developed
- ARR and BSB processes working to get resources; perception that some use BSBs in lieu of ARRs
- Staff/management ARR prioritization/funding not meeting department demand and not reflecting "critical" or "mass" campus needs
- ARR decisions not being communicated widely

Preliminary Themes: Overall
- Overall perception that only minor tweaks to processes are needed
- Desire to keep processes the same for a period of time (changes are burdensome to re-learn)
- Provide ongoing training on all processes for all constituents (keep them fresh)
- Lessen the culture of fear
- Dialogue outside of the processes needs improvement
- Concerns about whether participation within departments is broad enough ("experts" within departments do it all)
- Results from program review/assessment primarily being used for ARR justification

Preliminary Themes: TracDat
Pros:
- Data collected in a central location
- Reporting/summarizing capabilities
- Better than the prior MS Office-based collection
Cons:
- Navigation within TracDat not intuitive
- Reports hard to read and access
- Requires continuous training to use