Florida Association of Institutional Research 2011 Annual Conference The Evolution of the Comprehensive Academic Program Review - Three Years After Inception.

February 2011 | Florida Association of Institutional Research
Comprehensive Academic Program Review
Department of Academic Effectiveness, St. Petersburg College
P.O. Box 13489, St. Petersburg, FL (727) FAX (727)
Presenters:
• Leigh Hopf, Director, Baccalaureate Program Support Services
• James Coraggio, Director of Academic Effectiveness and Assessment

SPC Background
• SPC, established in 1927, is the oldest two-year college in Florida
• First community college in Florida to offer four-year degrees (2002)
• Nine campuses throughout the county
• FTE: 18,707 (lower division), 2,055 (upper division)
• Opening Fall 2010 credit enrollment: 32,429
• Annual headcount: 61,592

Purpose
• The CAPR model was originally presented at FAIR three years ago (2008)
• Since that time, CAPRs have been completed for 23 programs
• This presentation revisits the CAPR model and discusses its evolution

Compliance Issues with IE
• SACS's Peer Review Research Project (2008) revealed that, in the area of Institutional Effectiveness:
– 62% of institutions in off-site reviews were deemed non-compliant, and
– 27% of institutions in on-site visits were deemed non-compliant
• IE was the second most frequently identified area, behind faculty qualifications

SPC Experience with IE
• As part of the SACS reaccreditation process, SPC spent considerable time and effort documenting and detailing its IE processes
• The result: SPC had zero compliance issues in the area of IE during the off-site reviews

IE Evidence
SACS suggested documentation for Core Requirement (CR) 2.5:
• Evidence of the linkage of IE to the institutional mission
• Institutional plans and budgets that demonstrate linkage of assessment findings to planning at all levels
• Minutes of appropriate (IE-related) unit, committee, and task force meetings…
• Documentation that relates to IE, such as budget preparation instructions, minutes of budget presentation meetings, annual reports, annual assessment updates, and IE reports
• Samples of specific actions taken to improve the IE process and/or results from that process

Performance Improvement
• From compliance to performance improvement

IE at SPC
"Institutional Effectiveness is the integrated, systematic, explicit, and documented processes of measuring performance against the SPC mission for purposes of continuous improvement of academic programs, administrative services, and educational services offered by the college."
"Closing the Loop"

Changes & Improvements
• Evaluated individual assessment instruments and current assessment processes
• Found new ways to integrate assessment with the educational process
• Focused on fewer, yet more meaningful, improvements (action items)
• Used multiple assessment methods (direct and indirect)
• Involved faculty in the process
• Integrated the timing of assessments (three-year cycle)

Three-year Assessment Cycle
• Academic Program Viability Reviews (APVRs) conducted yearly
• Academic Program Assessment Reports (APARs)
• Comprehensive Academic Program Reviews (CAPRs)

Changes and Improvements
• Began looking at the 'big picture'
• Assessment focused on CHANGE through quality improvement
• Faculty-driven assessment process; Academic Effectiveness serves in a consultant capacity
• Improved access to, and awareness of, assessment information

Program Review Process
The CAPR process is a summative evaluation that includes two reports with multiple measures, such as:
– program-specific performance measures
– profitability measures
– economic data

Academic Program Viability Report
• Published yearly
• Included measures:
– Program graduates
– Course enrollment
– Unduplicated headcount
– Total placement
– Economic trend data

CAPR Objectives
The Comprehensive Academic Program Review (CAPR) was developed to meet three objectives within the academic assessment process:
– To provide a comprehensive report that summarizes all elements of the program's viability and productivity from a 360-degree perspective
– To provide comprehensive and relevant program-specific information to key College stakeholders, such as President's Cabinet members, so they can make critical decisions regarding the continued sustainability of a program
– To provide program leadership a vehicle to support and document actionable change for the purposes of performance improvement

Previous Program Review Model
• Traditionally, program reviews at SPC consisted primarily of a community focus group and a few occupational growth measures
• This information was presented to the President's Cabinet for evaluation
• The CAPR was designed to be more representative of a program's quality and, as such, contains measures drawn from a number of stakeholder perspectives

Elements of the CAPR
Specific CAPR measures include:
– the program description with recent program accreditation information
– program performance measures, including enrollment, productivity, grade distributions, and full-time/adjunct faculty ratios
– program profitability measures
– academic outcomes from recent end-of-program assessments
– stakeholder perceptions, including Student Survey of Instruction results, advisory committee minutes, and employer and recent alumni survey results
– occupation trends and information
– state graduate outcomes information
– the program director's perspective on program issues, trends, and recent successes

CAPR Measures
To ensure that the information presented in the CAPR is used appropriately, each measure includes:
– a standardized definition of the performance measure
– a reference to the source information used in the calculation

Program Performance Example
Figure 5: Program Graduates (Source: SPC Factbook, Table 31)

Program Profitability Example
Figure 9: Fiscal Summary (Source: PeopleSoft Financial Production database, report ID: ORGBUDSI)
The Relative Profitability Index (RPI) is calculated by dividing a program's income by the sum of its personnel costs and current expenses.
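As an illustration, the RPI calculation described above can be sketched in Python. This is a minimal sketch of the stated formula only; the function name and the example dollar figures are ours, not SPC's.

```python
def relative_profitability_index(income, personnel_costs, current_expenses):
    """Return a program's income divided by its total costs
    (personnel costs + current expenses), per the RPI definition.

    An RPI above 1.0 indicates income exceeds combined costs.
    """
    total_costs = personnel_costs + current_expenses
    if total_costs == 0:
        raise ValueError("total costs must be nonzero")
    return income / total_costs

# Hypothetical example: $500,000 income, $350,000 personnel,
# $50,000 current expenses -> RPI = 500,000 / 400,000 = 1.25
rpi = relative_profitability_index(500_000, 350_000, 50_000)
```

An index like this makes programs of different sizes directly comparable, since it normalizes income by each program's own cost base.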

Academic Outcomes Example
• The Digital Media/Multimedia Technology program was evaluated through an Academic Program Assessment Report (APAR)
• Each of the program's four major learning outcomes (MLOs) was evaluated

Stakeholder Perceptions Example
Figure 10: SSI Lecture Courses (Source: Student Survey of Instruction administration site)
The purpose of the SSI survey is to gather information on students' perceptions of the quality of courses, faculty, and instruction, and to provide feedback for improvement.

Occupation Profile Example
The distribution of 2007 wage information is presented by percentile for hourly and yearly wages.

Use of Results
• To encourage the use of results, the program director and provost are required to provide an action plan for improving the program's performance
• A follow-up report on these results is required the following year
• The CAPR process also includes a review of the CAPR documentation by the advisory committee and the President's Cabinet

Action Plan Example
The action plan also includes sections for special resources needed and area(s) of concern/improvement.

Stakeholder Perceptions
The CAPR has been well received by stakeholders (program directors, provosts, and members of the President's Cabinet):
– "…provides a more representative picture of the overall quality and sustainability of an individual academic program." – SVP, Baccalaureate Programs and University Partnerships
– "…greatly enhanced SPC's capacity to evaluate its lower-division academic programs and to encourage the use of assessment results to improve College programs." – VP, Information Systems, Business Services, Budgets, Planning

Ed Outcomes Site
• To provide a medium for completing educational assessment reports, as well as a repository for program-specific information, SPC developed an Educational Assessment Web site
• College administration and instructional staff have access to "completed" assessment reports, including the CAPR
• Online access further encourages the use of assessment data and highlights "best practices" across the college

Ed Outcomes Site – College Access

Ed Outcomes Site – PD Access

Ed Outcomes Site – CAPR
• The Program Review is just one of the educational materials included in the Ed Outcomes site
• CAPRs are uploaded to the site as complete documents (PDF)

Ed Outcomes Site – CAPR

Additions to CAPR
• The Summary section was replaced with the Program Director's Perspective: Issues, Trends, and Recent Successes
• A section containing the names of the program's major employers was added

Additions to CAPR
• An Unduplicated Headcount Enrollment measure was added
• FETPIP placement rates were also added

Lessons Learned
• The process requires adequate staffing and resources
• The process should not be a 'gotcha'; it should be improvement-focused and collegial in nature
• It requires the support of senior leadership
• It needs access to data and information

Lessons Learned
• Involve faculty from the start
• Develop models for disseminating reports and for encouraging transparency of information
• Use the process to share best practices across disciplines
• Be open to policy changes as a result of the gathered information

Future Direction
• A new Accreditation and Baccalaureate Assessment Coordinator position has been created
• CAPR and APVR program reviews for the baccalaureate programs will be among the first responsibilities
• The APVR for the baccalaureate programs will be conducted in 2011

Future Direction
Our ultimate goal is to provide stakeholders timely, relevant, accurate, and interpretable data through:
• formatted (dashboard-style) reports, and
• on-demand customizable reporting, with
• valid, reliable, and standardized measures.

Questions
Department of Institutional Research and Effectiveness
St. Petersburg College
P.O. Box 13489, St. Petersburg, FL (727) FAX (727)
