AY 2012 LBRT Instructional Program Review Process Data Description Process – Timeline Rev. 10-15-12.


Purpose
The purpose of this presentation is to describe the process we follow for the local Comprehensive review and the system-required Annual Program Reviews, and to explain how the data are derived. We have been asked to produce an annual program review for each of our instructional programs and units. These reviews are required of every community college in the system and will be taken to the U of H Board of Regents for their review. If you are normally scheduled to do a comprehensive review, or are "jumping," you will need to complete a comprehensive review this year. Additionally, every instructional and non-instructional program will do an Annual Program Review this year. Liberal Arts is not doing a comprehensive review this year. Click here for the Comprehensive Program-Unit Review Cycle and Schedule.

Reason for Program Review
Program Review is Assessment. The review of a program should be an on-going, year-round, reflective process. Program review processes assure educational quality and help programs to evaluate and improve their services. Program review is an opportunity for self-study and self-renewal, and an opportunity to identify the need for improvement. A robust program review process is one mechanism available to the college to improve student success.

What are we doing to improve our program review process?
Upon conclusion of every program/unit review cycle, the IR Office takes extra care to ensure that we are improving our program/unit review process on campus. This is accomplished by sending out questionnaires specific to each group, and by meeting with the various groups across campus to collect their feedback. Your suggestions for improving this process are then published to the Program Review website and are linked here for your convenience: AY 2011 Program/Unit Process Improvement Summary.
Based upon the feedback we received from everyone last year through our program-unit review process improvement focus groups, the following changes have been made and incorporated into the planning of this year's review:
 To assure the best possible attendance for our annual training, the VC for Academic Affairs will notify the Chancellor, VCs, and Directors when training is ready, and the training will be scheduled through the VCAA's office and secretaries. It will be the responsibility of the VCs and Directors to disseminate the day, time, and location of training to Writers and Initiators.
 To improve navigation on the program review website, our web developer has reorganized the current site and will be responsible for publishing documentation going forward.
 To provide the best possible support for training on campus, we will continue to have 3 separate training sessions this year: one for Units, one for Liberal Arts, and one for CTE programs.

What else are we doing to improve our program review process?
There were some performance issues with the online tool last year reported by Writers. Joni has taken your concerns to the appropriate party at the system office and has asked them to work out some of the issues. You said that the training you received last year regarding changes to the Instructional Program Review Comprehensive Template was helpful; we will continue to provide a brief overview of the templates upon conclusion of our normally scheduled training. All annual and comprehensive templates have been updated since last year based on your feedback and on an evaluation of the templates by CERC.
Three steps were added to our Comprehensive Program/Unit Review Process document:
 to ensure timely delivery of the budgetary data needed to complete an instructional program review;
 to ensure that VCs and Directors take more ownership of our local program/unit review process by providing training specific to their groups (e.g., the Vice Chancellor for Academic Affairs provides training to the Academic Support Group);
 to ensure that all of the suggested improvements we commit to each year through our Program/Unit Process Improvement efforts are completed by the responsible party before leaving for Summer.
This is the improved process we are using this year: Comprehensive Program-Unit Review Process.

What is different this year?
 Program Student Learning Outcomes are now required for all Liberal Arts annual reviews in the community college system.
 Under the online Web Submission "Cost per SSH" tab there is a new requirement to enter Tuition and Fees, which are now part of the Overall Program Budget Allocation for all instructional programs. Joni will be uploading all of the budget information you need this year.
 Under the Web Submission "Description" tab there is a new requirement to enter the web address of your last comprehensive review, along with the date completed. Please link your last comprehensive review here, not this year's comprehensive.
 Data elements with asterisks next to them are now used for health calls.
 In the online Web Submission section there is a new "P-SLO" tab requiring the following new information: Expected Level of Achievement, Courses Assessed, Assessment Strategy/Instrument, Results of Program Assessment, Other Comments, and Next Steps (see next slide for definitions).

What to enter under the P-SLO tab
 Expected Level of Achievement: Describe the different levels of achievement for each characteristic of the learning outcome(s) that were assessed: what represented "excellent," "good," "fair," or "poor" performance using a defined rubric, and what percentages were set as goals for student success (for example: "85% of students will achieve good or excellent in the assessed activity").
 Courses Assessed: List the courses assessed during the reporting period.
 Assessment Strategy/Instrument: Describe what, why, where, when, and from whom assessment artifacts were collected.
 Results of Program Assessment: The percentage of students who met the outcome(s), and at what level they met the outcome(s).
 Other Comments: Include any information that will clarify the assessment process report.
 Next Steps: Describe what the program will do to improve the results. "Next Steps" can include revisions to syllabi, curriculum, teaching methods, student support, and other options.

Jumper Defined
A Jumper is a locally defined term describing an instructional or non-instructional program (unit) that has decided to jump out of its normally scheduled slot for its comprehensive program review and into this year's cycle. Jumping into this year's comprehensive cycle means that you will have an opportunity to be considered for any budgetary decisions made in this year's budget process. Jumpers will still have to do their comprehensive review at their next scheduled review; jumping does not affect the existing schedule. You are voluntarily doing an extra review to be considered in this budget cycle.

I belong to the Liberal Arts Program… which form do I use?
COMPREHENSIVE REVIEWS: The Liberal Arts program is not required to do a comprehensive review this year. The new template is linked below for your reference: Comprehensive Instruction Program Review Template.
ANNUAL REVIEWS: Your program's data table is available online using the link below. You should have everything you need to begin writing your review within the web submission tool. Plan to save your work often, especially when switching between screens, and plan to do most of your formatting within the tool if you are copying and pasting in from Word. UHCC Annual Report of Program Data Web Submission Tool.

Terminology / Timing
The Census freeze event is the fifth Friday after the first day of instruction. The End-of-semester freeze event is 10 weeks after the last day of instruction. FISCAL_YR_IRO: fiscal year, where the value indicates the ending of the fiscal year. For example, a FISCAL_YR_IRO value of 2005 indicates the fiscal year (July 1, 2004 to June 30, 2005), which includes the Summer 2004, Fall 2004, and Spring 2005 semesters.

Instructional Program Review Data Elements
Student information data this year come exclusively from the Operational Data Store (ODS). Organizationally, this means that all community colleges are getting their data from the same place, and at the same snapshot in time (this is a good thing). The following slides explain in detail what data have been provided to you for your annual instructional program review write-up and how they have been calculated.

#1 Number of Majors
Count of program majors whose home institution is your college. The count excludes students who have completely withdrawn from the semester at Census. This is an annual number: programs receive a count of 0.5 for each term within the academic year that the student is a major, with a maximum count of 1.0 (one) for each student.
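The half-count-per-term rule above can be sketched as follows; the student counts are hypothetical, chosen only to illustrate the cap of 1.0 per student per year.

```python
# Annual major count: each student contributes 0.5 per term they are a
# major within the academic year, capped at 1.0 per student.
def annual_major_count(terms_as_major):
    """terms_as_major: one entry per student, the number of terms
    (0, 1, or 2) that student was a program major."""
    return sum(min(terms * 0.5, 1.0) for terms in terms_as_major)

# Hypothetical example: three year-round majors and two Fall-only majors
# contribute 3 * 1.0 + 2 * 0.5 = 4.0 annual majors.
print(annual_major_count([2, 2, 2, 1, 1]))  # 4.0
```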

#2 Percent Change in Majors from Prior Year
In alignment with UHCC Strategic Planning Goals, General and Pre-Professional Education programs are expected to grow by 3% per year. Data Source: the difference between the number of majors (#1) in the current year and the prior year, divided by the number of majors in the prior year. For HawCC in the current reporting year: (current-year majors - 1093) / 1093 * 100 = 19%.
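As a sketch of the calculation just described: the prior-year count of 1093 comes from the slide, while the current-year count of 1301 is a hypothetical value chosen only to reproduce the stated 19% figure.

```python
# Percent change in majors: (current - prior) / prior * 100.
def pct_change(current, prior):
    return (current - prior) / prior * 100

# 1093 prior-year majors (from the slide); 1301 is hypothetical.
print(round(pct_change(1301, 1093)))  # 19
```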

#3 SSH Program Majors in Program Classes
The sum of Fall and Spring SSH taken by program majors in courses linked to the program. Captured at Census; excludes students who have already withdrawn (W) at this point. Note: for programs where year-round attendance is mandatory, Summer SSH are included. Excludes Directed Studies (99 series). Differs from MAPS in that UHCC data include Cooperative Education (93 series), as there is a resource cost to the program. Not sure what your program classes are? Click here to find out: Courses Taught Aligned to Instructional Programs.

#4 SSH Non-Majors in Program Classes
The sum of Fall and Spring SSH taken by non-program majors (not counted in #3) in courses linked to the program. Captured at Census; excludes students who have already withdrawn (W) at this point. Excludes Directed Studies (99 series). Differs from MAPS in that UHCC data include Cooperative Education (93 series), as there is a resource cost to the program.

#5 SSH in All Program Classes
The sum of Fall and Spring SSH taken by all students in classes linked to the program. Captured at Census; excludes students who have already withdrawn (W) at this point. Excludes Directed Studies (99 series). Differs from MAPS in that UHCC data include Cooperative Education (93 series), as there is a resource cost to the program.

#6 FTE Enrollment in Program Classes
Sum of Student Semester Hours (SSH) taken by all students in classes linked to the program (#5), divided by 30. Undergraduate, lower-division Full-Time Equivalent (FTE) is calculated as 15 credits per term. Captured at Census; excludes students who have already withdrawn (W) at this point.
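The FTE calculation is a single division; the SSH total below is a hypothetical figure used only for illustration.

```python
# FTE enrollment: total annual SSH in program classes (#5) divided by 30
# (15 credits per term, two terms).
total_ssh = 4521              # hypothetical Fall + Spring SSH total
fte_enrollment = total_ssh / 30
print(round(fte_enrollment, 1))  # 150.7
```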

#7 Total Number of Classes Taught
Total number of classes taught in Fall and Spring that are linked to the program. Concurrent and cross-listed classes are counted only once, for the primary class. Excludes Directed Studies (99 series). Differs from MAPS in that UHCC data include Cooperative Education (93 series), as there is a resource cost to the program.

LA Program Scoring Rubric Definitions
Your program's health is determined by 3 separate types of measures: Demand, Efficiency, and Effectiveness. This slide explains why these measures were chosen to determine program health.
 Demand: a seeking or state of being sought after, i.e. your program's ability to attract new students every year based on your offerings.
 Efficiency: acting or producing effectively with a minimum of waste, expense, or unnecessary effort, i.e. your program's ability to use its resources in the best possible way.
 Effectiveness: stresses the actual production of, or the power to produce, an effect, i.e. your program's ability to produce the desired result.

Determination of Program's Health
This year the system office will calculate and report health calls for all instructional programs using academic year 2012 data. If you are interested in how the Liberal Arts health calls were determined, refer to the rubric linked here: Liberal Arts Health Call Scoring Rubric.

#8 Average Class Size
Total number of students actively registered in Fall and Spring program classes, divided by classes taught (#7). Captured at Census; excludes students who have already withdrawn (W) at this point. Excludes Directed Studies (99 series). Differs from MAPS in that UHCC data include Cooperative Education (93 series), as there is a resource cost to the program.

#9 Fill Rate
Total active student registrations in program classes (number of seats filled) at Fall and Spring census, divided by the maximum enrollment (number of seats offered). Captured at Census; excludes students who have already withdrawn (W) at this point.
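The fill rate is seats filled over seats offered; both figures below are hypothetical.

```python
# Fill rate: active registrations at census / maximum enrollment.
seats_filled = 1730           # hypothetical Fall + Spring active registrations
seats_offered = 2200          # hypothetical sum of maximum enrollments
fill_rate = seats_filled / seats_offered * 100
print(round(fill_rate, 1))    # 78.6
```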

#10 FTE BOR-Appointed Faculty
Sum of appointments (1.0, 0.5, etc.) of all BOR-appointed program faculty (excludes lecturers and other non-BOR appointees). Uses the "hiring status" of the faculty member, not the teaching/work load. Uses the Employing Agency Code (EAC) recorded in the Human Resources (HR) database to determine the faculty member's program home. Faculty teaching solely Remedial and/or Developmental Reading, Writing, or Mathematics are excluded from the counts in Liberal Arts and included in the counts for the Remedial and/or Developmental programs. Faculty with "split" assignments among Remedial/Developmental classes and transfer-level classes are reflected proportionally. Data as of October, provided by the UH Human Resources Office. Click here to find the number of BOR-appointed program faculty for your program: 2012 BOR Appointed Program Faculty.

#11 Majors to FTE BOR-Appointed Faculty
Number of majors (#1) divided by the sum of appointments (#10) (1.0, 0.5, etc.) of all BOR-appointed program faculty. The data show the number of student majors in the program for each faculty member (25 majors to 1 faculty shown as "25").

#12 Majors to Analytic FTE Faculty
Number of majors (#1) divided by the number of Analytic FTE faculty (#12a).

#12a Analytic FTE Faculty (Workload)
Calculated as the sum of semester hours (not Student Semester Hours) taught in program classes, divided by 27. Analytic FTE is useful as a comparison to FTE of BOR-appointed faculty (#10). Used for analysis of program offerings covered by lecturers.
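The workload-based FTE above divides by 27 rather than 30; the semester-hour total below is hypothetical.

```python
# Analytic FTE faculty: semester hours taught in program classes / 27.
semester_hours_taught = 351   # hypothetical annual semester-hour total
analytic_fte = semester_hours_taught / 27
print(analytic_fte)           # 13.0
```

Comparing this figure against the BOR-appointed FTE (#10) indicates how much of the program's offerings are covered by lecturers.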

#13 Overall Program Budget Allocation
The overall program budget allocation = General Funded Budget Allocation (#13a) + Special/Federal Budget Allocation (#13b) + Tuition and Fees (#13c) + other fees. It is automatically calculated when you enter your general funded budget allocation, special/federal budget allocation, other funds, or tuition and fees into the online tool tab called "Cost per SSH." Again, Joni will upload the data for you this year. The overall program budget allocation is determined by the College and includes: salaries (general funds, special funds, etc.), overload, lecturers, costs for all faculty and staff assigned to the program, supply and maintenance, and tuition and fees.

#13a General Funded Budget Allocation
The general funded budget allocation = actual personnel costs + B-budget expenditures.

#13b Special/Federal Budget Allocation
The expenditure of dollars from Federal grants.

#13c Tuition and Fees
New this year: the amount collected in tuition and fees in the 2012 academic year is included in your program's overall budget allocation.

#14 Cost per SSH
Overall Program Budget Allocation (#13) divided by SSH in all program classes (#5).
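Cost per SSH combines the two elements just defined; the dollar and SSH figures below are hypothetical.

```python
# Cost per SSH: overall budget allocation (#13) / SSH in all classes (#5).
budget_allocation = 1_250_000   # hypothetical overall allocation (#13)
total_ssh = 4521                # hypothetical SSH in all program classes (#5)
cost_per_ssh = round(budget_allocation / total_ssh, 2)
print(cost_per_ssh)             # 276.49
```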

#15 Number of Low-Enrolled (<10) Classes
Classes taught (#7) with 9 or fewer active students at Census. Excludes students who have already withdrawn (W) at this point. Excludes Directed Studies (99 series). Includes Cooperative Education (93 series), as there is a resource cost to the program.

#16 Successful Completion (Equivalent C or Higher)
Percentage of students actively enrolled in program classes at Fall and Spring census who at the end of the semester have earned a grade equivalent to C or higher. Captured at Census; excludes students who have already withdrawn (W) at this point.

#17 Withdrawals (Grade = W)
Number of students actively enrolled (who at this point have not withdrawn) at Fall and Spring census who at the end of the semester have a grade of W.

#18 Persistence Fall to Spring
Count of students who are majors in the program at Fall census (from Fall semester #1) and who at the subsequent Spring semester census are enrolled and are still majors in the program. Excludes students who have already withdrawn (W) at this point. Example: 31 majors start in Fall; 21 of the original 31 persist into Spring; 21/31 = 0.6774, or 67.74%.
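The slide's own worked example (21 of 31 Fall majors persisting) can be reproduced directly:

```python
# Fall-to-Spring persistence rate, using the numbers from the slide.
fall_majors = 31
persisting_majors = 21
persistence_pct = round(persisting_majors / fall_majors * 100, 2)
print(persistence_pct)  # 67.74
```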

#19 Unduplicated Degrees/Certificates Awarded, Prior Fiscal Year
Unduplicated headcount of students in the reported fiscal year to whom a program degree or any certificate has been conferred (sum of #19a and #19b, then unduplicated). Uses the most recent available freeze of fiscal year data. For ARPD year 2010, the most recent fiscal data on August 15, 2010, was from FY. For ARPD year 2011, the most recent fiscal data on August 15, 2011, was from FY. For ARPD year 2012, the most recent fiscal data on August 15, 2012, was from FY.

#19a Associate Degrees Awarded
Degrees conferred in the FISCAL_YEAR_IRO. The count is of degrees and may show duplicate degrees received in the program by the same student. Uses the most recent available freeze of fiscal year data. FISCAL_YR_IRO: "Fiscal year, where the value indicates the ending of the fiscal year. For example, a FISCAL_YR_IRO value of 2005 indicates the fiscal year 2004-2005 (July 1, 2004 to June 30, 2005) which includes Summer 2004, Fall 2004, and Spring 2005 semesters…"

#19b Academic Subject Certificates Awarded
The count is of program Academic Subject Certificates and may show multiple Academic Subject Certificates in the same program received by the same student. Uses the most recent available freeze of fiscal year data.

#19c Goal
The UH Community Colleges have established increasing the number of Associate Degrees awarded as a priority. Beginning with the baseline of 2006 (see MAPS Degrees and Certificates Earned 2005-2006), programs are expected to increase the number of Associate Degrees by 3%, compounded annually. The number in #19c reflects the goal for the appropriate year. Uses the most recent available freeze of fiscal year data. Example: 119 degrees was the goal for 2009-10, so 123 is the goal for 2010-11 (a 3% increase).
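The year-over-year step of the compounded goal reproduces the slide's example (119 for 2009-10 becomes 123 for 2010-11):

```python
# One step of the 3% compounded degree goal, using the slide's figures.
prev_goal = 119
next_goal = round(prev_goal * 1.03)
print(next_goal)  # 123
```

The same 1.03 factor is applied each year from the 2006 baseline, so a goal n years out is baseline * 1.03 ** n.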

#19d Difference Between Unduplicated Awarded and Goal
The percent difference between the number of Associate Degrees awarded (#19a) and the goal (#19c). General Pre-Professional programs are expected to increase Associate Degrees awarded by 3%, compounded annually. Uses the most recent available freeze of fiscal year data. Example: the program's goal is 119 (#19c); the actual number of Associate Degrees awarded (#19a) is 154, which is 35 degrees more than the goal, so the program exceeded its goal by 29%: (154-119)/119*100 = 29%.
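The percent-difference calculation, using the figures from the slide's example:

```python
# Percent difference between degrees awarded and the goal.
awarded, goal = 154, 119      # figures from the slide's example
pct_over_goal = (awarded - goal) / goal * 100
print(round(pct_over_goal))   # 29
```

The same formula is used for #20d below, with transfers in place of degrees.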

#20 Transfers to UH 4-yr
Students whose home campus is UH Manoa, UH Hilo, or UH West Oahu for the first time in the reporting Fall and who, prior to the reporting Fall, had a UH community college (HawCC) as home campus. Also includes students who for the first time in the reporting Fall had Maui College as home campus with major ABIT or ENGT. This is a program measure: a student is included in the count of program transfers in as many programs as they have been a major in at the college.

#20a Transfers with Degree from Program
Students included in #20 who received a degree from the community college program prior to transfer. Does not include any certificates.

#20b Transfers without Degree from Program
Students included in #20 who did not receive a degree from the community college program prior to transfer but did transfer.

#20c Increase by 3% Annual Transfers to UH 4-yr Goal
The UH Community Colleges have established increasing the number of students who transfer to UH 4-year institutions as a priority. Beginning with the baseline of 2006, General Pre-Professional programs are expected to increase the number of students who transfer by 3%, compounded annually. The number in #20c reflects the goal for the appropriate year. Source: UHCC Strategic Plan 2.4. A subset of all transfers, filtered for the specific program.

#20d Difference Between Transfers and Goal
The percent difference between the number of transfers to UH 4-year institutions (#20) and the goal (#20c). General Pre-Professional programs are expected to increase transfers to UH 4-year institutions by 3%, compounded annually. Example: the program's goal is 59 (#20c); the actual number of transfers (#20) is 68, which is 9 more than the goal, so the program exceeded its goal by 15%: (68-59)/59*100 = 15%.

#21 Number of Distance Education Classes Taught
Measures the number of classes taught with the mode of delivery "Distance Completely Online." In setting up the class, the college indicates the method of instruction used by the instructor in conducting the class. If the method is Distance Education and the college indicates the "mode" of distance delivery as "Distance Completely Online," the class is included in this count.

#22 Enrollment in Distance Education Classes
At the Fall and Spring census, the number of students actively enrolled in all classes owned by the program and identified as Distance Completely Online (#21). Does not include students who at Census have already withdrawn from the class.

#23 (DE) Fill Rate
Total active student registrations in program distance education classes (#22) (number of seats filled) at Fall and Spring census, divided by the maximum enrollment (number of seats offered). Does not include students who at Census have already withdrawn from the class.

#24 (DE) Successful Completion (Equivalent C or Higher)
Percentage of students enrolled in program Distance Education classes (#22) at Fall and Spring census who at the end of the semester have earned a grade equivalent to C or higher.

#25 (DE) Withdrawals (Grade = W)
Number of students actively enrolled in program Distance Education classes (#22) at Fall and Spring census who at the end of the semester have a grade of W.

#26 Persistence (Fall to Spring, Not Limited to Distance Ed)
Students enrolled in program distance education classes at Fall census (#22) who at the subsequent Spring semester census are enrolled in the college. Not limited to students continuing to take program distance education classes. Example: 31 students enrolled in DE classes in Fall; 21 of the original 31 persist into any Spring class; 21/31 = 0.6774, or 67.74%.

Comprehensive & Annual Program Review Timeline
 Develop training materials for the Instructional GPP program
 Provide Program Review training to the GPP Instructional Program
 Develop and publish the BOR Appointed Faculty and Classes Taught aligned-to-program files
 The Instructional GPP program review is due to the Interim VC for Academic Affairs / UHCC ARPD website by EOB Friday, November 30th
 Follow up on last year's list of suggested improvements for completion
 Edit the instructional GPP glossary and health call scoring rubric and provide suggested changes to the system office
 Plan this year's program review based on suggested improvements from last year's review
 Evaluate the ARPD online web submission site for functionality and report bugs to the system office
 Administer PR Process Improvement Sessions (develop questionnaire, schedule sessions, collect feedback)
 Work with the AC and Interim VCAA to determine what feedback needs to be taken back to the UHCC IPRC
 Update documentation on the website needed for program review
 Send a campus-wide update on the program review process with a link to the data and the due date (Sept)
 Summarize PR Process Improvement feedback, communicate results to groups, and publish to the web (Jan 5th)
Dates: Aug; Aug-Sept; Sept; Sept-Oct; Oct; Nov 30; Jan 25th; Jan 27th

AY 2012 Comprehensive & Annual LA Review Process
Step 1: Write your program review using the appropriate template or online submission.
Step 2: Send your documents (one Word document per review) to Interim VCAA Joni Onishi no later than end of business, Friday, November 30th.
Step 3: The Interim VCAA will ensure that all required documents have been received and that they are adequate.
Step 4: The Interim VCAA will forward all approved reviews to the Institutional Research Office for further processing.
Step 5: The annual reviews will be sent to the System Office for review by the UH Board of Regents. Comprehensive reviews will be forwarded along as appropriate, following CERC guidelines.
Step 6: All comprehensive reviews will finally be converted to PDF and posted to the Program Review website.

Questions?
The intention of this presentation was to provide a single source for all of the documentation related to the Comprehensive & Annual Program Review process, specifically for the Liberal Arts program. All of the documents you should need for your review have been linked directly into this presentation. If you need more information on this process, please feel free to contact me: HawCC Institutional Analyst Shawn Flood. Mahalo!

CERC Comprehensive Program Review Template & Process
James will now discuss the comprehensive instructional program review template and briefly cover the current CERC process: Comprehensive Instruction Program Review Template. If you need assistance, please contact me: James Kiley, Institutional Assessment Coordinator. Mahalo!