AY 2009 Instructional Program Review Data Delivery Plan Data Description Process – Timeline Rev. 11-4-09.


Purpose
The primary purpose of this presentation is to provide clarity to all instructional programs on how the data is calculated for both our local Comprehensive and the system-required Annual Program Reviews. We have been asked to produce an annual program review for every one of our instructional programs and units. They are required of each system CC and will be taken to the U of H Board of Regents for their review.
So who has to do a program review this year? If you are normally scheduled to do a comprehensive review, or are "Jumping," you will do a comprehensive review this year. Additionally, every instructional and non-instructional program will do an Annual Program Review this year.
Not sure if you're scheduled for a comprehensive review or not? Click here for the Program-Unit Review Schedule.

What are we doing to improve our program review process?
Based upon observation and the feedback we received from last year's program review process improvement survey, the following changes have been implemented:
- Over the course of the summer the program review website was developed and most of the documentation was updated, to reduce the amount of time it takes to get you the data you need for your review.
- Labels were added to each instructional data sheet to clarify the period of time the data was taken from.
- Health calls come to you pre-populated this year from the system office, to ensure the consistent application of calculations across the UHCC system.
- Your overall program budget this year breaks out the general funded costs from the special/federal grant dollars, to highlight how much of your budget is dependent upon grants.
- Academic Subject Certificates and a new data element, "Other Certificates Awarded," were added this year so programs receive credit for those credentials as well.
- Academic year data is provided for 0809 this year, to include spring data as well as fall.
- Your data file has been embedded into the coversheet to reduce your formatting time. Only one document will be required for annuals this year.
- Four separate training sessions will be given this year in order to focus on specific groups:
  1. ATE Division Programs
  2. Hospitality, Business Ed, and Nursing Division Programs
  3. Liberal Arts Division Programs
  4. Units

What's a Jumper?
A Jumper is a locally defined term used to describe an instructional program or non-instructional program (unit) that has decided to jump out of its normally scheduled slot for its comprehensive program review and into this year's cycle.
Why would anyone want to do that? Jumping into this year's comprehensive cycle means that you will have an opportunity to be considered for any budgetary decisions made in this year's budget process. Jumpers will still have to do their comprehensive review on their next scheduled review. Jumping does not affect the existing schedule: you are voluntarily doing an extra review to be considered in this budget cycle.

I belong to an Instructional Program… which template do I use?
Your program's data table has been embedded into your coversheet (for annuals) or into the comprehensive program review template you will need for your review, so you should have everything you need to begin writing. The blank templates and coversheets are linked below in case you need them.
UHCC Annual Instructional Program Review Coversheet (ALL instructional programs will need to complete this, even if you're completing a comprehensive review)
Instructional Comprehensive Program Review Template (Use this template ONLY if you are scheduled for a comprehensive program review this year or are jumping)

What's new this year in the Annual Report of Program Data table?
All of the changes listed below are for the 0809 academic year only. The Fall 06 and Fall 07 data is from last year's program review, using last year's routines, and using fall data only.
- New and Replacement Jobs are now prorated for programs offered by more than one college on the same island (if 3 community colleges on Oahu offer the same program, they will divide the county jobs up proportionate to the number of majors at each college). This is for County data only.
- Annual data (summer, for programs with required summer attendance; fall; spring). [Note: this is not the traditional academic year. Annual data starting with Summer will allow us to get degree and certificate data for all 3 years.]
- New measures for Distance Education.
- Health calls (overall, demand, efficiency, effectiveness) are pre-populated this year, according to the scoring rubric set by the UHCC I-PRC.
- New process to follow when Perkins IV Core Indicator goals are not met.

Instructional Program Review Data Elements
Wherever possible, previous-year data was copied directly in from last year's program review. Therefore, the columns labeled Fall 06 and Fall 07 reflect what was reported to you last year. The system office reported your 0809 data using today's routines, which may vary from what was reported last year; this is due to the system office using "improved" routines over the ones used last year.
Student information data this year comes exclusively from the Operational Data Store (ODS). Organizationally this means that all community colleges are getting the data from the same place and at the same snapshot in time (this is a good thing).
The following slides explain in detail what data has been provided to you for your comprehensive/annual instructional program review write-ups and how it has been calculated.

#1 New and Replacement Positions (State)
This data element represents the combined new and replacement jobs for the State of Hawaii in your trade, projected for AY0607, AY0708, and AY0809. Economic Modeling Specialists Inc. (EMSI) compiles data based on the Standard Occupational Codes (SOC) that the college has linked to the instructional program. From their website: "…EMSI specializes in reports that analyze and quantify the total economic benefits of community and technical colleges in their region, and also creates data-driven strategic planning tools that help colleges maximize their impact through labor market responsiveness…"

#2 New and Replacement Positions (County prorated)
This data element represents the combined new and replacement jobs for the County of Hawaii in your trade, projected for AY0607, AY0708, and AY0809. Economic Modeling Specialists Inc. (EMSI) compiles data based on the Standard Occupational Codes (SOC) that the college has linked to the instructional program. When multiple community colleges are in the same county, the number of positions has been prorated based upon the proportion of majors at each CC.

#3 Number of Majors
Since the number of majors in 0809 is an annual count, 0.5 majors were assigned for each major in both fall and spring, then added up for the year. This method was preferred over counting unduplicated majors. Note that the 2 previous reporting periods are Fall majors only.
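As a small illustration of the annual counting rule above (the Fall and Spring counts here are made up for the example):

```python
def annual_major_count(fall_majors: int, spring_majors: int) -> float:
    """Annual (0809-style) major count: each major counts as 0.5 per
    term of enrollment, summed across Fall and Spring."""
    return 0.5 * fall_majors + 0.5 * spring_majors

# Hypothetical program: 30 Fall majors and 26 Spring majors.
print(annual_major_count(30, 26))  # 28.0
```

A student enrolled as a major in both terms therefore counts as a full 1.0 for the year, while a one-term major counts as 0.5.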

#4 SSH Program Majors in Program Classes
This is the sum of all student semester hours taken by program majors in our locally defined program classes for the academic year. For Practical Nursing this includes SSH earned in Summer. Excludes Directed Studies (99 series). Includes Cooperative Education. Not sure what your program classes are? Click here to find out: Program Course Listing

#5 SSH Non-Majors in Program Classes
This is the sum of student semester hours taken by non-program majors in our locally defined program courses. For Practical Nursing this includes SSH earned in Summer. Excludes Directed Studies (99 series). Includes Cooperative Education (93 series).

#6 SSH in All Program Classes
The sum of student semester hours taken by all students in our locally defined program courses. For Practical Nursing this includes SSH earned in Summer. Excludes Directed Studies (99 series). Includes Cooperative Education (93 series).

#7 FTE Enrollment in Program Classes
This is the sum of student semester hours taken by all students in your program classes (#6) / 30 for the academic year. Full-time equivalent (FTE) is calculated as 15 credits per term. For Fall 06 and Fall 07 it is the sum of student semester hours taken by all students in your program classes (#6) / 15. For Practical Nursing this includes SSH earned in Summer.
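A minimal sketch of the FTE arithmetic above (the SSH total is hypothetical):

```python
def fte_enrollment(total_ssh: float, annual: bool = True) -> float:
    """FTE enrollment (#7): SSH in all program classes (#6) divided by
    30 for an academic year (15 credits per term x 2 terms), or by 15
    for a single-Fall snapshot as in the Fall 06 / Fall 07 columns."""
    return total_ssh / (30 if annual else 15)

print(fte_enrollment(450))                # 15.0 (annual)
print(fte_enrollment(450, annual=False))  # 30.0 (single Fall term)
```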

#8 Total Number of Classes Taught
This is the number of program classes (actual sections) taught in the program. For Practical Nursing this includes classes taught in Summer. The number of classes taught excludes Directed Studies courses (99, 099, 199, 299) but includes Cooperative Ed courses (93, 093, 193, 293).

Determination of program's health based on demand
This year the system office will calculate and report health calls for all instructional programs using academic year 0809 data. The following instructions illustrate how those calls are made. Program Demand is determined by taking the number of majors (#3) and dividing it by the number of New and Replacement Positions by County (#2). The following benchmarks are used to determine demand health:
Healthy: 1.5 – 4.0
Cautionary: .5 – 1.49; 4.1 – 5.0
Unhealthy: < .5; > 5.0
Finally, an Overall Category Health Score is assigned where:
2 = Healthy
1 = Cautionary
0 = Unhealthy
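The demand scoring above can be sketched as follows. Note that the source shows the Cautionary bands explicitly; the Healthy band is inferred here as the gap between them, so treat the exact boundaries as an assumption:

```python
def demand_health(majors: float, county_positions: float) -> int:
    """Score program demand: majors (#3) / county new & replacement
    positions (#2), mapped to 2 (Healthy), 1 (Cautionary), 0 (Unhealthy)."""
    ratio = majors / county_positions
    if 1.5 <= ratio <= 4.0:                          # inferred Healthy band
        return 2
    if 0.5 <= ratio < 1.5 or 4.0 < ratio <= 5.0:     # stated Cautionary bands
        return 1
    return 0                                         # everything else

print(demand_health(60, 20))   # ratio 3.0 -> 2 (Healthy)
print(demand_health(20, 20))   # ratio 1.0 -> 1 (Cautionary)
print(demand_health(120, 20))  # ratio 6.0 -> 0 (Unhealthy)
```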

#9 Average Class Size
Average class size is total student registrations in program classes divided by the total number of classes taught (#8), at time of census. This excludes Directed Studies courses (99, 099, 199, 299) but includes Cooperative Ed courses (93, 093, 193, 293).

#10 Fill Rate
Class fill rate is total student registrations in program classes (number of seats filled) divided by the maximum enrollments (number of seats offered). Taken at Fall and Spring census.

#11 FTE BOR Appointed Faculty
FTE of BOR Appointed Program Faculty is the sum of appointments to your program (1.0, 0.5, etc.), excluding Lecturers and other non-BOR appointees. Remember that these are positions that were appointed to your program, whether or not these people are actually teaching classes. This information now comes directly from system HR; if this is not correct for your program, we can work to get it updated in the HR system. Click here to find out who the BOR Appointed Program Faculty are for your program: Instructional Program BOR Appointments

#12 Majors to FTE BOR Appointed Faculty
The number of majors (from data element #3) divided by the number of FTE BOR Appointed Program Faculty (from data element #11). Note: this is not all students taking your classes…just the majors.

#13a Analytic FTE Faculty (Workload)
Analytic FTE Faculty is a workload measure and is determined by summing the semester hours taught by instructors in their program courses, then dividing by 15 per Fall semester (or 27 for the year) to get the full-time equivalent value. Analytic FTE Faculty includes the semester hours taught by both Faculty and Lecturers. Therefore, it can be compared to the value of (#11) FTE BOR Appointed Faculty to highlight program offerings being covered by lecturers.
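The workload calculation above reduces to one division; a sketch with a hypothetical credit-hour total:

```python
def analytic_fte(credit_hours_taught: float, annual: bool = True) -> float:
    """#13a: semester hours taught (by Faculty AND Lecturers) in program
    courses, divided by 27 for the year or by 15 for one Fall semester."""
    return credit_hours_taught / (27 if annual else 15)

print(analytic_fte(54))                # 2.0 FTE for the year
print(analytic_fte(30, annual=False))  # 2.0 FTE for a single Fall semester
```

Comparing this value against #11 (BOR appointed FTE) shows how much of the teaching load is carried by lecturers rather than appointed faculty.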

#13 Majors to Analytic FTE Faculty
Total number of majors (from data element #3) divided by Analytic FTE Faculty (from data element #13a) for the academic year.

#13c Analytic FTE 12 cr.
Starting in Fall 2007 we added a data element to the review for programs that operate on contact hours instead of credit hours. For these programs we calculate workload faculty FTE by dividing credits taught by 12 instead of 15 per Fall semester, and by 21.6 for the year. If you do not have a #13c in the data set provided to you, please ignore this additional detail; it was only added to the data sheets for programs operating on contact hours. The contact hours calculation for Faculty FTE is not recognized by the system office, so we are adding it to the program review because it makes more sense for programs operating on contact hours. It is also used in the calculation of data elements 13b, 14, 14a, and 15. Programs identified as operating on contact hours are: AG, ABRP, AEC, AMT, CARP, DISL, DMA, EIMT, ET, FSER, MWIM, NURS, PRCN, CHO, and TEAM. (Note: AEC newly added this year.) This data element is added as a supplement to the instructional program review for Hawaii Community College.

#13b Majors to Analytic FTE 12 cr.
Number of majors (#3) in your program divided by the contact-hour calculation for Analytic FTE from the previous slide.

#14 Overall Program Budget Allocation
The overall program budget allocation = General Funded Budget Allocations + Special/Federal Budget Allocations (grant dollars). The general funded budget allocation = personnel costs + B budget. With the exception of the 3 Nursing programs, we calculate personnel costs for AY09 by multiplying the Analytic FTE Faculty (which includes lecturers) by the AY 2009 UHPA faculty rank 4 rate per credit hour of $1676, then by 30 credits. For the 3 Nursing programs, we calculate personnel costs for AY09 by multiplying the Analytic FTE Faculty (which includes lecturers) by the AY 2009 UHPA faculty rank 5 rate per credit hour of $1879, then by 30 credits.
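The budget formula above can be sketched directly; the rates come from the slide, while the B budget and grant amounts in the example are placeholders:

```python
# AY 2009 UHPA per-credit-hour rates from the slide:
# rank 4 for most programs, rank 5 for the 3 Nursing programs.
RATE_PER_CREDIT = {"rank4": 1676, "rank5": 1879}

def personnel_cost(analytic_fte: float, nursing: bool = False) -> float:
    """Analytic FTE Faculty (incl. lecturers) x rate per credit x 30 credits."""
    rate = RATE_PER_CREDIT["rank5" if nursing else "rank4"]
    return analytic_fte * rate * 30

def overall_budget(analytic_fte: float, b_budget: float,
                   grant_dollars: float, nursing: bool = False) -> float:
    """General funds (personnel + B budget) plus special/federal grant dollars."""
    return personnel_cost(analytic_fte, nursing) + b_budget + grant_dollars

print(personnel_cost(2.0))  # 2.0 x 1676 x 30 = 100560.0
```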

#15 Cost per SSH
This is the cost to run your program based on student semester hours: Program Budget Allocation (data element #14) divided by student semester hours for all students in program classes (from data element #6).

#16 Number of Low-Enrolled (<10) Classes
This is the number of program classes (actual sections) taught (#8) with 9 or fewer students enrolled at census. Excludes Directed Studies (99 series). Includes Coop Ed (93 series).

Determination of program's health based on efficiency
This year the system office will calculate and report health calls for all instructional programs using 0809 data. The following instructions illustrate how those calls are made. Program Efficiency is calculated using 2 separate measures: Fill Rate (#10) and Majors to FTE BOR Appointed Faculty (#12). The following benchmarks are used to determine health for Fill Rate:
Healthy: 75 – 100%
Cautionary: 60 – 74%
Unhealthy: < 60%
An Overall Category Health Score is assigned where:
2 = Healthy
1 = Cautionary
0 = Unhealthy

Determination of program's health based on efficiency, cont.
The following benchmarks are used to determine health for Majors/FTE BOR Appointed Faculty:
Healthy: 7 – 29
Cautionary: 30 – 60
Unhealthy: 61+; 6 or fewer
An Overall Category Health Score is assigned where:
2 = Healthy
1 = Cautionary
0 = Unhealthy
Finally, average the 2 Overall Category Health Scores for Class Fill Rate and Majors/FTE BOR Appointed Faculty, then use the following rubric:
1.5 – 2 = Healthy
.5 – 1 = Cautionary
0 = Unhealthy
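The averaging step above can be sketched as follows. The source states the .5 – 1 Cautionary band and the 0 Unhealthy value; the Healthy band is taken here as everything above 1 (i.e., 1.5 – 2), which is an inference:

```python
def efficiency_health(fill_rate_score: int, majors_per_fte_score: int) -> int:
    """Average the two 0/1/2 category scores and apply the rubric:
    2 = Healthy, 1 = Cautionary, 0 = Unhealthy."""
    avg = (fill_rate_score + majors_per_fte_score) / 2
    if avg >= 1.5:       # inferred Healthy band (above the .5-1 Cautionary band)
        return 2
    if avg >= 0.5:       # stated Cautionary band
        return 1
    return 0             # stated Unhealthy value

print(efficiency_health(2, 1))  # avg 1.5 -> 2 (Healthy)
print(efficiency_health(1, 1))  # avg 1.0 -> 1 (Cautionary)
print(efficiency_health(0, 0))  # avg 0.0 -> 0 (Unhealthy)
```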

#17 Successful Completion (Grade >= C)
The percentage of successful completions is new to our program review beginning this academic year. This is the percentage of students in program courses at Fall and Spring census who, at the end of the semester, have earned a grade equivalent to "C" or higher.

#18 Withdrawals (Grade = W)
The percentage of withdrawals is new to our program review beginning this academic year. This is the percentage of students actively enrolled as of Fall and Spring census who, at the end of the semester, have a grade of "W".

#19 Persistence Fall to Spring
Fall to Spring Persistence is the number of your program majors (#3) who, at the subsequent spring semester, are enrolled and are still majors in the program at time of census. Example: 31 majors start in Fall; 21 of the original 31 persist into Spring; 21/31 = .6774, or 67.74%.
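The slide's 21-of-31 example works out like this:

```python
def persistence_rate(fall_majors: int, persisting_majors: int) -> float:
    """#19: majors still enrolled and still majors at the subsequent
    Spring census, divided by Fall majors."""
    return persisting_majors / fall_majors

# The slide's example: 21 of 31 Fall majors persist into Spring.
print(round(persistence_rate(31, 21), 4))  # 0.6774
```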

#20 Unduplicated Degrees/Certificates Awarded
Unduplicated Degrees/Certificates Awarded is new to our program review beginning this academic year. This is a count of your program majors that received either a degree or a certificate, and it is the measurement used in the determination of the effectiveness health call. Unlike other student data, which is tracked within the academic year, degrees/certificates are awarded within the fiscal year. The UH fiscal year begins July 1st and runs through June 30th. The system office tries to send us your data by August 15th. Since the degrees/certificates data has not been updated in the ODS by that time, we would normally not see the most recent degrees/certificates. To account for this, a decision was made that the degrees/certificates data for 0809 would come from summer of the previous year. In other words, the academic year data you see for 0809 is actually Summer 08, Fall 08, and Spring 09, not the degrees/certificates earned in the 0809 traditional academic year (fall – spring – summer).

#20a Number of Degrees Awarded
This is a count of credentials and shows duplicate credentials received by the same student. It is the number of degrees conferred on your program majors. Again, for 0809 it is the number of degrees conferred in the fiscal year: Summer 08, Fall 08, and Spring 09.

#20b Certificates of Achievement Awarded
The number of Certificates of Achievement conferred on your program majors in the fiscal year. For 0809 it is the number of CA certificates conferred in the fiscal year: Summer 08, Fall 08, and Spring 09.

#20c Academic Subject Certificates Awarded
The number of Academic Subject Certificates conferred on your program majors in the fiscal year. For 0809 it is the number of ASC certificates conferred in the fiscal year: Summer 08, Fall 08, and Spring 09.

#20d Other Certificates Awarded
The number of certificates, other than CA and ASC, conferred on your program majors in the fiscal year. For 0809 it is the number of certificates (except CA and ASC) conferred in the fiscal year: Summer 08, Fall 08, and Spring 09.

#21 Transfers to UH 4-yr Programs
This is the number of your program majors (#3) with home campus at UH Manoa, UH West Oahu, or UH Hilo for the first time in Fall 08 who, prior to Fall 08, had a UH Community College as their home campus. A student is included in the count of program transfers in as many programs as they have been a major in at the college.

#21a Transfers with Degree from Program
Students included in #21 (transfers to UH 4-year) who received a degree from the community college program prior to transfer.

#21b Transfers without Degree from Program
Students included in #21 (transfers to UH 4-year) who did not receive a degree from the community college program prior to transfer.

Determination of program's health based on effectiveness
This year the system office will calculate and report health calls for all instructional programs using academic year 0809 data. The following instructions illustrate how those calls are made. Program Effectiveness is calculated using 3 separate measures: Unduplicated Degrees/Certificates Awarded (#20) / Majors (#3); Unduplicated Degrees/Certificates Awarded (#20) / Annual New and Replacement Positions (County prorated) (#2); and Persistence Fall to Spring (#19). The following benchmarks are used to determine health for Unduplicated Degrees/Certificates Awarded per major:
Healthy: > 20%
Cautionary: 15 – 20%
Unhealthy: < 15%
An Overall Category Health Score is assigned where:
2 = Healthy
1 = Cautionary
0 = Unhealthy

Determination of program's health based on effectiveness, cont.
The second measure used to determine health is Unduplicated Degrees/Certificates Awarded (#20) / Annual New and Replacement Positions (County prorated) (#2). The following benchmarks are used for this measure:
Healthy: .75 – 1.5
Cautionary: < .75 and 1.5 – 3.0
Unhealthy: > 3.0
An Overall Category Health Score is assigned where:
2 = Healthy
1 = Cautionary
0 = Unhealthy

Determination of program's health based on effectiveness, cont.
The third measure used to determine health is Persistence (Fall to Spring) (#19). The following benchmarks are used for this measure:
Healthy: %
Cautionary: %
Unhealthy: < 60%
An Overall Category Health Score is assigned where:
2 = Healthy
1 = Cautionary
0 = Unhealthy

Determination of program's health based on effectiveness, cont.
You should now have a value of zero, one, or two for each of the 3 effectiveness measures. Determining the Effectiveness health call score takes 3 steps:
Step #1: Add up all 3 Overall Category Health Scores for the effectiveness measures (the zeros, ones, and twos you assigned earlier).
Step #2: Determine the effectiveness category health call range, where:
= Healthy
= Cautionary
= Unhealthy
Step #3: Use the scoring rubric below to determine the effectiveness health call score (for example: if you had a Healthy 5 in the previous step, you would assign it a Healthy 2 here):
2 = Healthy
1 = Cautionary
0 = Unhealthy
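The three steps above can be sketched as one function. Note that the range boundaries for Step #2 did not survive in this copy of the slides; the bands below are assumptions, and the source confirms only that a total of 5 maps to Healthy:

```python
def effectiveness_health(deg_per_major: int, deg_per_position: int,
                         persistence: int) -> int:
    """Sum the three 0/1/2 category scores (total 0-6), then map the
    total to a health call score of 2, 1, or 0."""
    total = deg_per_major + deg_per_position + persistence
    # Band boundaries are ASSUMED; the slide's example confirms only
    # that a total of 5 is Healthy.
    if total >= 5:
        return 2  # Healthy
    if total >= 2:
        return 1  # Cautionary
    return 0      # Unhealthy

print(effectiveness_health(2, 2, 1))  # total 5 -> 2 (Healthy)
```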

Determination of program's overall health
You should now have a value of zero, one, or two for each of the 3 program health calls: Demand, Efficiency, and Effectiveness. Simply add those 3 values together and use the Scoring Range Rubric below to determine the overall health of your program:
= Healthy
= Cautionary
= Unhealthy

#22 Number of Distance Education Classes Taught
The number of distance education classes taught is new to our program review beginning this academic year. It measures the number of classes taught using the mode of delivery "Distance Completely On-Line" (DCO). If the method of instruction for teaching your class was determined to be Distance Education, the college will indicate the mode of delivery, in this case Distance Completely On-Line.

#23 Enrollment in Distance Education Classes
The enrollment in distance education classes is new to our program review beginning this academic year. As of Fall and Spring census, this is the number of students actively enrolled in all classes owned by the program and identified as Distance Completely On-Line (DCO).

#24 Fill Rate (Distance Completely On-Line Classes)
The fill rate for distance education classes is new to our program review beginning this academic year. Fill rate is total student registrations in DCO program classes (number of seats filled) divided by the maximum enrollments (number of seats offered). Taken at Fall and Spring census.

#25 Successful Completion (Grade >= C) (Distance Completely On-Line Classes)
The percentage of successful completions in distance education classes is new to our program review beginning this academic year. This is the percentage of students in DCO program courses at Fall and Spring census who, at the end of the semester, have earned a grade equivalent to "C" or higher.

#26 Withdrawals (Grade = W) (Distance Completely On-Line Classes)
The percentage of withdrawals in distance education classes is new to our program review beginning this academic year. This is the percentage of students actively enrolled in DCO classes as of Fall and Spring census who, at the end of the semester, have a grade of "W".

#27 Persistence (Fall to Spring, not limited to Distance Ed)
The persistence fall to spring of distance education classes is new to our program review beginning this academic year. Fall to Spring Persistence is the number of your program majors (#3) who in Fall are in DCO classes and at the subsequent spring semester are still majors in the program at time of census. Example: 31 majors start in Fall; 21 of the original 31 persist into Spring; 21/31 = .6774, or 67.74%.

#28 Perkins Core Indicator: Technical Skills Attainment (1P1)
Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data that you see in your program review is for the 0708 academic year. All Perkins Core Indicator goals not met must be addressed in the narrative and action plan in your review. A concentrator is a student who has a major (taken from the major field in Banner) in a CTE program and who has completed 12 or more credits by the end of the Perkins year. Technical Skills Attainment is calculated as:
Numerator: Number of concentrators who have a cumulative GPA >= 2.00 in Career and Technical Education courses and who have stopped program participation in the year reported.
Denominator: Number of concentrators who have stopped program participation in the year reported.
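The 1P1 ratio above is a straightforward numerator/denominator calculation; a sketch with hypothetical counts:

```python
def technical_skills_attainment(passing_stoppers: int, all_stoppers: int) -> float:
    """1P1: concentrators with cumulative CTE GPA >= 2.00 who stopped
    program participation in the year reported, divided by all
    concentrators who stopped participation that year."""
    return passing_stoppers / all_stoppers

# Hypothetical counts: 18 of 24 leaving concentrators had a CTE GPA >= 2.00.
print(f"{technical_skills_attainment(18, 24):.1%}")  # 75.0%
```

The remaining Perkins indicators (2P1 through 5P1) follow the same pattern: divide the indicator-specific numerator by its denominator and compare the result against the goal.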

#29 Perkins Core Indicator: Completion (2P1)
Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data that you see in your program review is for the 0708 academic year. All Perkins Core Indicator goals not met must be addressed in the narrative and action plan in your review. A concentrator is a student who has a major (taken from the major field in Banner) in a CTE program and who has completed 12 or more credits by the end of the Perkins year. Completion is calculated as:
Numerator: Number of concentrators who received a degree or certificate in a Career and Technical Education program and who have stopped program participation in the year reported.
Denominator: Number of concentrators who have stopped program participation in the year reported.

#30 Perkins Core Indicator: Student Retention or Transfer (3P1)
Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data you see in your program review is for the 2007-08 academic year. All Perkins Core Indicator goals not met must be addressed in the narrative and action plan in your review. A concentrator is a student who has a major (taken from the major field in Banner) in a CTE program and who has completed 12 or more credits by the end of the Perkins year.
Student Retention or Transfer is calculated as:
Numerator: Number of concentrators in the year reported who have not completed a program and who continue postsecondary enrollment or have transferred to a baccalaureate degree program.
Denominator: Number of concentrators in the year reported who have not completed a program.

#31 Perkins Core Indicator: Student Placement (4P1)
Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data you see in your program review is for the 2007-08 academic year. All Perkins Core Indicator goals not met must be addressed in the narrative and action plan in your review. A concentrator is a student who has a major (taken from the major field in Banner) in a CTE program and who has completed 12 or more credits by the end of the Perkins year.
Student Placement is calculated as:
Numerator: Number of concentrators in the year reported (previous Perkins year) who stopped program participation and who are placed or retained in employment, military service, or an apprenticeship program within the UI quarter following program completion.
Denominator: Number of concentrators in the year reported (previous Perkins year) who stopped program participation.

#32 Perkins Core Indicator: Nontraditional Participation (5P1)
Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data you see in your program review is for the 2007-08 academic year. All Perkins Core Indicator goals not met must be addressed in the narrative and action plan in your review. A concentrator is a student who has a major (taken from the major field in Banner) in a CTE program and who has completed 12 or more credits by the end of the Perkins year.
Nontraditional Participation is calculated as:
Numerator: Number of participants from underrepresented groups who participated in a program that leads to employment in nontraditional fields during the reporting year.
Denominator: Number of participants who participated in a program that leads to employment in nontraditional fields during the reporting year.

#33 Perkins Core Indicator: Nontraditional Completion (5P2)
Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data you see in your program review is for the 2007-08 academic year. All Perkins Core Indicator goals not met must be addressed in the narrative and action plan in your review. A concentrator is a student who has a major (taken from the major field in Banner) in a CTE program and who has completed 12 or more credits by the end of the Perkins year.
Nontraditional Completion is calculated as:
Numerator: Number of concentrators from underrepresented gender groups who completed a program that leads to employment in nontraditional fields during the reporting year.
Denominator: Number of concentrators who completed a program that leads to employment in nontraditional fields during the reporting year.
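All six core indicators follow the same pattern: a numerator population divided by a denominator population. The sketch below shows that shared pattern applied to the nontraditional indicators (5P1/5P2). The field name `underrepresented_gender` is a hypothetical illustration, not an actual reporting field.

```python
# Generic numerator/denominator pattern shared by the Perkins core indicators.
# Field names are hypothetical, for illustration only.

def rate(population, in_numerator):
    """Share of the denominator population satisfying the numerator condition."""
    if not population:
        return None  # empty denominator: rate is undefined
    return sum(1 for s in population if in_numerator(s)) / len(population)

# 5P1 example: among participants in programs leading to nontraditional
# employment, the share who are from underrepresented gender groups.
participants = [
    {"underrepresented_gender": True},
    {"underrepresented_gender": False},
    {"underrepresented_gender": True},
    {"underrepresented_gender": False},
]
print(rate(participants, lambda s: s["underrepresented_gender"]))  # 0.5
```

The same `rate` helper would cover 5P2 by passing concentrators who completed a qualifying program as the population.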

Comprehensive & Annual Program Review Timeline
Develop presentation and training materials
Provide Annual Program Review training to the campus
Package and deliver all Program & Unit Reviews to the System Office
Post all Program & Unit Reviews to the Assessment Website
All instructional program and unit reviews (Comprehensive and Annual) due to the Interim VCAA by EOB Wednesday, December 2nd
Deliver all three years of data to instructional programs
Collect and deliver data to Student Support Services
Build out the 2009 program review page on the assessment website
Plan this year's program review based on suggested improvements from last year's review
Create and post a "Lessons Learned" document to capture year-over-year improvements to the existing PR process
Determine which templates will be used and ensure they are linked to a common (UHCC) website
Review all PR documentation and work with Trina and the AC to update it
Determine what data needs to be added to our instructional program review beyond what the system delivers
Develop and administer the PR Process Improvement Survey
Work with the AC and Interim VCAA to determine what needs to be taken back to the UHCC IPRC
Receive one year's data from the System Office and begin collecting the other two years

AY2009 Comprehensive & Annual Review Process
Step 1: Write your instructional program review using the appropriate template.
Step 2: Send your documents (one Word document per review) to Interim VCAA Noreen Yamane no later than end of business, Wednesday, December 2nd.
Step 3: The Interim VCAA will ensure that all required documents have been received and that they are adequate.
Step 4: The Interim VCAA will forward all approved reviews to the Institutional Research Office for further processing.
Step 5: Annual reviews will be packaged appropriately and sent to the System Office for review by the UH Board of Regents; Comprehensive reviews will be forwarded as appropriate following CERC guidelines.
Step 6: All reviews will be converted to PDF and posted to the Assessment Web Site.

Questions?
This presentation is intended to provide a single source for all of the documentation related to the Comprehensive & Annual Program Review process. All of the documents you should need are linked directly into this presentation. For additional details on instructional data elements, see the UHCC Annual Instructional Program Review Glossary. If you need more information on this process, please feel free to contact me: Shawn Flood. Mahalo!