Presentation on theme: "AY 2013 CTE Instructional Program Review Process Data Description Process – Timeline Rev. 10-18-13."— Presentation transcript:

1 AY 2013 CTE Instructional Program Review Process Data Description Process – Timeline Rev. 10-18-13

2

3 Covered in today's session…
- Quick review of the Program and Unit Review website
- Navigating the Annual Reports of Program Data (ARPD) website
- Overview of our local program and unit review process on campus
- What are we doing to improve our process?
- What is different this year?
- Terminology used in the review
- Data definitions
- Timeline
- Assessment with James Kiley
- New Annual Review and Budget Process with Dean of CTE Joyce Hamasaki
- New CERC/Comprehensive Review Process with VCAA Joni Onishi

4 Purpose The purpose of this presentation is to describe the process we follow for our local Comprehensive Program Reviews and the system-required Annual Program Reviews, and to provide definitions of the data used in the reviews. We have been asked to produce an annual program review for each of our instructional programs and non-instructional units. They are required of each community college in the system and will be taken to the U of H Board of Regents for their review. You will also need to complete a comprehensive review this year if you are scheduled for one. Every program and unit is required to complete a comprehensive review once every 5 years. Click on the link below to find out if you are scheduled for a comprehensive review this year. Comprehensive Program-Unit Review Cycle and Schedule

5 Reason for Program Review The review of a program should be an on-going, year-round, reflective process. Program review processes assure educational quality and help programs evaluate and improve their services. Program review is an opportunity for self-study, self-renewal, and identifying needed improvements. Your program review may be one of the few opportunities you have to showcase the accomplishments of your program. Take this opportunity to shine. A robust program review process is one mechanism available to the college to improve student success.

6 What are we doing to improve our program review process? At the conclusion of every program/unit review cycle, the IR Office takes extra care to ensure that we are improving our program/unit review process on campus. This is accomplished by sending out questionnaires specific to the groups, then meeting with those groups across campus and collecting their feedback. The feedback is reviewed and an action plan is put into place, which is used in the planning phase for our next review cycle. Your suggestions, and the actions taken to improve this process, are then published to the Program Review website and have been linked here for your convenience: 2012 Program-Unit Process Improvement Summary. To drive accountability in the organization, the Vice Chancellor for Academic Affairs will ensure that all of the work we commit to as part of this process is completed every year. Based on the feedback we received from last year's program-unit review process improvement focus groups, the following changes have been made and incorporated into the planning of this year's review:

7 What else are we doing to improve our program review process? In order to assure the best possible communication and attendance for our annual training, the IR Office developed the Program-Unit Review Campus Communication Plan. We found that information about our program and unit review process on campus was not always getting to the people who need it, the availability of faculty, lecturers, and staff was not taken into consideration when scheduling meetings/training, and email alone was not sufficient to reach everyone on campus. The new communication plan includes: managing a scheduling request for meetings, using the campus-wide email distribution to include faculty, lecturers, and staff, and printing a hardcopy reminder to be placed in division/department mailboxes and posted in break areas. Additionally, the Vice Chancellor for Academic Affairs will work with the Admin Team and the DCs to ensure that we are adequately communicating program and unit review activities across the campus.

8 There were some issues that came up last year from folks who were unable to find the assessments they needed in order to write their reviews. It turns out that the assessments had all been moved from the assessment website to the Intranet, but instructions were never sent out to communicate the change. All assessments can now be found in the assessment folder on the Intranet. Click here to log in and view your assessments: Intranet Assessments. Your suggestions for improving our program review process at the system level were taken to the Instructional Program Review Committee (UHCC IPRC) by our UHCC IPRC representatives on campus. Results of that meeting are published here: 2012 UHCC IPRC Responses. A special "Mahalo" goes out to our local UHCC IPRC representatives for helping us improve this important process!

9 What else are we doing to improve our program review process? The Academic Support Unit Review online tool and glossaries have been greatly improved since last year, based in part on the feedback that was provided. A step was added to our Comprehensive Program/Unit Review Process document to ensure that VCs and Directors update the schedule for both comprehensive and annual reviews on campus every year; this work is to be completed in June before we begin the new cycle. Another step was added to invite the IR Office to the first scheduled Admin meeting in September to communicate changes. This is the improved process we are using this year: Comprehensive Program-Unit Review Process

10 What is different this year? Remember: Joni will find someone to copy your template information into the online ARPD tool. Just fill out the template you are required to submit this year! The following data elements are all new this year:
- Number of Majors Native Hawaiian
- Fall Full-Time
- Fall Part-Time
- Fall Part-Time that are full-time in system
- Spring Full-Time
- Spring Part-Time
- Spring Part-Time that are full-time in system
- Persistence Fall to Fall
Performance Funding:
- Number of Degrees and Certificates
- Number of Degrees and Certificates Native Hawaiian
- Number of Degrees and Certificates STEM
- Number of PELL Recipients
- Number of Transfers to UH 4-yr

11 Navigating the ARPD Web Submission Tool If you attended the training session in person today, you can skip the next few slides (go to slide #20) that deal with navigating the ARPD site. I've included the slides here for those who may not have been able to join us today. The Annual Reports of Program Data Web Submission Tool (ARPD for short) is a repository for annual program and unit reviews for Instruction, the Academic Support Unit, and Student Support Services. Much of the data that you will need to complete your review has been provided by the Office of the Vice President for Community Colleges. The ARPD is a home-grown tool, developed in-house specifically to meet the needs of the community college system.

12 Navigating the ARPD Web Submission Tool cont… Your program's data table is available on-line within the tool by August 15th every year. You should have everything you need to begin writing your review within the web submission tool. Plan to save your work often—especially when switching between screens, and plan to do most of your formatting within the tool if you are copying and pasting in from Word. Also plan to spell-check and save your review in Word before starting. One of the nicest features of the ARPD is that you can go back and look at reviews from previous years. You can also enhance your own review by drawing on similar reviews at other institutions.

13 Navigating the ARPD Web Submission Tool continued… UHCC Annual Report of Program Data Web Submission Tool Begin by clicking on the link above. To enter information on your review, click on the button in the lower part of the screen called "2013 Instructional Submission". You will be asked to log in with your UH username and password in order to get to the web submission site. The default will take you to the "Status" tab, where you will be able to see who last modified the information in your review. Clicking on the "Users" tab will take you to a screen that shows all of the people who have permission to update your program. For each program there is a list of people with specific roles: Program Coordinator, Div./Dept. Chair, and Dean. The VCAA determines who can update your program; if there is someone you'd like to add to the list, contact the VCAA with your request. You can enter the information about your program in any order you wish, but moving from left to right across the tabs at the top of the screen follows the same logical path we used when filling out the templates in the past. Start by clicking on the "Analysis" tab. On the Analysis screen you can either preview what has already been entered by clicking on the "Preview" button, or go to the edit screen and begin entering or updating information.

14 Navigating the ARPD Web Submission Tool continued… On the Analysis screen, begin editing by going to the bottom of the page, under the data sheet. Click the edit button and scroll down. There are 3 sections for you to edit here: Analysis of your Program, Action Plan, and Resource Implications. Simply click on the "Edit" link and you will be taken to a screen very similar to MS Word, where you can begin typing in your analysis. (Note the Save button in the left-hand corner!) Note that in the Action Plan section you will need to include action plans for any of your Perkins Core Indicators where the program did not meet the goal. Check your data sheet for this. Also very important this year: if you are requesting additional people, services, or equipment for your program, you will need to make the justification in the "Resource Implications" section. Requests for your program are no longer part of the comprehensive review as they have been in the past. Now click on the "Description" tab and input the year and web address of your last comprehensive review. You should be able to copy and paste the link right off the program-unit review website for the year of your last review. Finish this tab by typing in a brief description of your program and mission. Now, click on the "P-SLOs" tab and go to the next slide…

15 What to enter on the "P-SLO" tab in ARPD Web Submission Tool
Indicate which PLOs were assessed during the reporting year's assessment(s).
Evidence of Industry Validation: Provide documentation that the program has submitted evidence and achieved certification or accreditation from an organization granting certification in an industry or profession. If the program/degree/certificate does not have a certifying body, the recommendations for, approval of, and/or participation in assessment by the program's advisory committee/board can be submitted.
Expected Level of Achievement: Describe the different levels of achievement for each characteristic of the learning outcome(s) that were assessed. What represented "excellent," "good," "fair," or "poor" performance using a defined rubric, and what percentages were set as goals for student success (for example: "85% of students will achieve good or excellent in the assessed activity").
Courses Assessed: List the courses assessed during the reporting period.
Assessment Strategy/Instrument: Describe what, why, where, when, and from whom assessment artifacts were collected.
Results of Program Assessment: The % of students who met the outcome(s) and at what level they met the outcome(s).
Other Comments: Include any information that will clarify the assessment process report.
Next Steps: Describe what the program will do to improve the results. "Next Steps" can include revisions to syllabi, curriculum, teaching methods, student support, and other options.

16 What to enter on the "Cost Per SSH" tab in ARPD Web Submission Tool There are 4 different values that need to be added on the Cost Per SSH screen in ARPD. All of the fund amounts that are entered are used in the calculation of the cost per SSH for your program. The costs that need to be entered are: General Funds, Federal Funds, Other Funds, and Tuition and Fees. The definitions of what comprises each of these funds are laid out in detail in the data definitions section of this presentation, so they will not be duplicated here. Joni will be uploading all of the budget information you need this year. Once the values are loaded and saved in the tool, the values for "Total Funds" and "Cost per SSH" will auto-calculate into your datasheet.

17 What to enter on the "External" tab in ARPD Web Submission Tool The "External" screen is intended for programs that utilize external licensures. Currently, the only program at HawCC with external licensures is the Nursing program. If you are not in the Nursing program you can skip this tab/screen altogether, because the first question is, "Does this program utilize external licensures?" and the radio button defaults to "No". (Mighty thoughtful programming, I'd say.) If you are in the Nursing program, please click the "Yes" radio button to answer the question and then enter the Number sitting for exam and the Number passed. Do this for each program where you utilize external licensures. The percent passed will auto-calculate into your datasheet when you click the "Save External Data" button. The Vice President for Community Colleges' office collects this data as part of our annual Graduate-Leaver reporting as well as the UH System "Measuring Our Progress" report.

18 What to enter on the "Capacity" tab in ARPD Web Submission Tool The "Capacity" screen is only intended for programs that have an externally mandated capacity. The following criterion is used as an alternate measure for the Student/Faculty Ratio measure within the Efficiency health call: "If your program has an externally mandated (e.g. professional accreditation or licensing) capacity of less than 16 students per faculty, the program may be eligible for the alternative efficiency health call calculation." If your program fits the criterion listed above, AND your Efficiency health call is other than Healthy, AND your efficiency health call can be improved by using the alternate method, please contact: Cheryl Chappell-Long, Director, Academic Planning, Assessment, and Policy Analysis. Phone: 808-956-4561. Email: cchappel@hawaii.edu

19 "Help" tab in ARPD Web Submission Tool The "Help" screen is a very useful resource for some of the documentation needed to support your program within the review process. It contains all of the glossaries and health call scoring rubrics for each year we have used the online tool.

20 Process for Completing an Instructional Program Review COMPREHENSIVE REVIEWS If you are on the schedule to complete a comprehensive review this year, follow this simple 2-step process: Step 1: Complete your review in the ARPD online submission tool. Step 2: Move to the Comprehensive Review Process for further instructions. ANNUAL REVIEWS If you are NOT on the schedule to complete a comprehensive review this year, follow this simple 2-step process: Step 1: Complete your review in the ARPD online submission tool. Step 2: Move to the Annual Review & Budget Process for further instructions. The comprehensive review process, the annual review and budget process, and their associated templates are brought to you this year by the VCAA and Interim Dean of CTE Programs. The template and instructions should be part of their presentation today.

21 Process Owners The graphic below depicts who is responsible for each part of the program review process—from ARPD through CERC. For more information please use the contact list below. Contact List: Assessment = James Kiley, 934-2649; IR = Shawn Flood, 934-2648; Dean CTE = Joyce Hamasaki, 934-2522; VCAA = Joni Onishi, 934-2514.
[Process-owner graphic: owners are Assessment, IR, Dean CTE, and VCAA. Annual Review path: ARPD Web Submission → Annual Review & Budget Process/Template → Done! Comprehensive Review path: ARPD Web Submission → Comprehensive Review Process/Template → CERC → Publish Reviews.]

22 Terminology / Timing The Census freeze event is the fifth Friday after the first day of instruction. The End of Semester (EOS) freeze event is 10 weeks after the last day of instruction. Degrees are conferred in the "Fiscal Year". The fiscal year value represents the ending of that fiscal year. For example, a FISCAL_YR_IRO value of 2013 indicates the fiscal year 2012-2013 (July 1, 2012 to June 30, 2013), which includes the Summer 2012, Fall 2012, and Spring 2013 semesters. "Home Institution" is the campus where the student was admitted. The "Program Year" listed on your data sheet as "12-13" represents your program's data for the Summer 2012, Fall 2012, and Spring 2013 semesters. Student information data this year comes exclusively from the Operational Data Store (ODS). Organizationally this means that all community colleges are getting the data from the same place and at the same snapshot in time (this is a good thing).

23 #1 New and Replacement Positions (State) Economic Modeling Specialists Inc. (EMSI) annual new and replacement jobs at the state level. Compiles data based on Standard Occupational Classification (SOC) codes aligned to the program's Classification of Instructional Programs (CIP) codes. Data based on annual new/replacement position projections as of Spring 2013. State position numbers are not pro-rated. AY 2013 HawCC CIP Code Listing

24 #2 New and Replacement Positions (County prorated) Economic Modeling Specialists Inc. (EMSI) annual new and replacement jobs at the county level. Compiles data based on Standard Occupational Classification (SOC) codes aligned to the program's Classification of Instructional Programs (CIP) codes. Note: It is possible for the number of new and replacement positions in the county to be higher than the state if the projection in other counties is for a loss of new and replacement positions. County data are pro-rated to reflect the number of programs aligned to the SOC code and weighted by the number of majors in each program/institution for programs that share SOC codes. Data based on annual new/replacement position projections as of Spring 2013.

25 #3 Number of Majors Count of program majors whose home institution is your college. Count excludes students who have completely withdrawn from the semester at Census. This is an annual number. Programs receive a count of 0.5 for each term (fall and spring) within the academic year that the student is a major, with a maximum count of 1.0 (one) for each student.
#3a Number of Majors Native Hawaiian Count of program majors who are Native Hawaiian and whose home institution is your college. Count excludes students who have completely withdrawn from the semester at Census. This is an annual number. Programs receive a count of 0.5 for each term (fall and spring) within the academic year that the Native Hawaiian student is a major, with a maximum count of 1.0 (one) for each student.
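Since the 0.5-per-term counting rule can be confusing, here is a minimal sketch of the arithmetic. The record layout and function name are illustrative, not part of the ARPD system.

```python
# Illustrative sketch of the annual majors count (#3). Each fall or spring
# term in which a student is a program major contributes 0.5, capped at
# 1.0 per student for the academic year.

def annual_major_count(major_terms):
    """major_terms maps student ID -> set of terms ('fall', 'spring')
    in which the student was a program major at census."""
    total = 0.0
    for terms in major_terms.values():
        total += min(0.5 * len(terms), 1.0)  # 0.5 per term, max 1.0 per student
    return total

# A student who is a major in both fall and spring counts as 1.0;
# a student who is a major in only one term counts as 0.5.
print(annual_major_count({"s1": {"fall", "spring"}, "s2": {"fall"}}))  # 1.5
```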

26 #3b Fall Full-Time Percentage of majors (#3) enrolled in 12 or more credits at the college in the reporting Fall semester.
#3c Fall Part-Time Percentage of majors (#3) enrolled in less than 12 credits at the college in the reporting Fall semester.
#3d Fall Part-Time who are Full-Time in System Percentage of majors in #3c (enrolled in less than 12 credits in the reporting Fall semester at the institution) who are enrolled in credits at other UH institutions such that their total number of credits enrolled in the UH System is equal to or greater than 12.

27 #3e Spring Full-Time Percentage of majors enrolled in 12 or more credits in the reporting Spring semester at the institution.
#3f Spring Part-Time Percentage of majors enrolled in less than 12 credits at the institution in the reporting Spring semester.
#3g Spring Part-Time who are Full-Time in System Percentage of majors in #3f (enrolled in less than 12 credits in the reporting Spring semester at the institution) who are enrolled in credits at other UH institutions such that their total number of credits enrolled in the UH System is equal to or greater than 12.
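A sketch of how the fall splits (#3b-#3d) might be computed. The input tuples are assumptions, and the definition reads as if #3d is a percentage of the part-time group (#3c), so that denominator is an assumption here; the same logic applies to the spring splits (#3e-#3g).

```python
# Hypothetical sketch of the fall full-time/part-time splits.
# Each major is (home_campus_credits, total_uh_system_credits) for fall.

def fall_enrollment_splits(majors):
    n = len(majors)
    ft = [m for m in majors if m[0] >= 12]   # #3b: full-time at home campus
    pt = [m for m in majors if m[0] < 12]    # #3c: part-time at home campus
    # #3d: part-time at home campus but >= 12 credits across the UH System.
    # Denominator assumed to be the part-time group (#3c), per the wording.
    pt_ft_sys = [m for m in pt if m[1] >= 12]
    return (len(ft) / n * 100,
            len(pt) / n * 100,
            len(pt_ft_sys) / len(pt) * 100 if pt else 0.0)

# One full-time major, two part-time majors, one of whom carries 13 credits
# system-wide and so is full-time in the system: (33.3, 66.7, 50.0).
print(fall_enrollment_splits([(15, 15), (9, 13), (6, 6)]))
```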

28 #4 SSH Program Majors in Program Classes The sum of Fall and Spring Student Semester Hours (SSH) taken by program majors in courses linked to the program. Captured at Census and excludes students who have already withdrawn (W) at that point. Note: for programs where year-round attendance is mandatory, Summer SSH are included. Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series), as there is a resource cost to the program. Not sure what your program classes are? Click here to find out: Courses Taught Aligned to Instructional Programs

29 #5 SSH Non-Majors in Program Classes The sum of Fall and Spring Student Semester Hours (SSH) taken by non-program majors (not counted in #4) in courses linked to the program. Captured at Census and excludes students who have already withdrawn (W) at that point. Note: for programs where year-round attendance is mandatory, Summer SSH are included. Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series), as there is a resource cost to the program.

30 #6 SSH in All Program Classes The sum of Fall and Spring Student Semester Hours (SSH) taken by all students in classes linked to the program. Captured at Census and excludes students who have already withdrawn (W) at that point. Note: for programs where year-round attendance is mandatory, Summer SSH are included. Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series), as there is a resource cost to the program.

31 #7 FTE Enrollment in Program Classes Sum of Student Semester Hours (SSH) taken by all students in classes linked to the program (#6) divided by 30. Undergraduate, lower-division Full-Time Equivalent (FTE) is calculated as 15 credits per term. Captured at Census and excludes students who have already withdrawn (W) at that point. Note: for programs where year-round attendance is mandatory, Summer SSH are included.
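The FTE arithmetic in one line, as a sketch; the input figure is made up.

```python
# FTE enrollment (#7): annual SSH in all program classes (#6) divided by 30
# (15 credits per term x 2 terms for lower-division undergraduates).

def fte_enrollment(ssh_all_program_classes):
    return ssh_all_program_classes / 30

print(fte_enrollment(1260))  # 1,260 SSH -> 42.0 FTE
```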

32 #8 Total Number of Classes Taught Total number of classes taught in Fall and Spring that are linked to the program. Includes Summer classes if year-round attendance is mandatory. Concurrent and cross-listed classes are only counted once, for the primary class. Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series), as there is a resource cost to the program.

33 CTE Program Scoring Rubric Definitions Your program health is determined by 3 separate types of measures: Demand, Efficiency, and Effectiveness. This slide explains why these measures were chosen to determine program health. Demand: A seeking or state of being sought after, i.e. your program's ability to attract new students every year based on your offering. Efficiency: Acting or producing effectively with a minimum of waste, expense, or unnecessary effort, i.e. your program's ability to use its resources in the best possible way. Effectiveness: Stresses the actual production of, or the power to produce, an effect, i.e. your program's ability to produce the desired result.

34 Determination of program's health based on demand This year the system office will calculate and report health calls for all instructional programs using academic year 2013 data. The following instructions illustrate how those calls are made. Program Demand is determined by taking the number of majors (#3) and dividing it by the number of New and Replacement Positions by County (#2). The following benchmarks are used to determine demand health: Healthy: 1.5 - 4.0. Cautionary: 0.5 - 1.49; 4.1 - 5.0. Unhealthy: less than 0.5 or greater than 5.0. Finally, an Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy.
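A minimal sketch of the demand health call, using the benchmark bands quoted above. The "unhealthy" bounds (below 0.5 or above 5.0) are reconstructed from the cautionary ranges and are an assumption; the function name is illustrative.

```python
# Sketch of the demand category health score (#3 divided by #2),
# bucketed with the bands quoted on the slide.

def demand_health(majors, county_positions):
    ratio = majors / county_positions
    if 1.5 <= ratio <= 4.0:
        return 2  # Healthy
    if 0.5 <= ratio <= 1.49 or 4.1 <= ratio <= 5.0:
        return 1  # Cautionary
    return 0      # Unhealthy (assumed: < 0.5 or > 5.0)

print(demand_health(60, 24))  # 60 / 24 = 2.5 -> 2 (Healthy)
```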

35 #9 Average Class Size Total number of students actively registered in Fall and Spring program classes divided by classes taught (#8). Does not include students who have already withdrawn from the class by Census. Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series), as there is a resource cost to the program.
#10 Fill Rate Total active student registrations in program classes (number of seats filled) at Fall and Spring census divided by the maximum enrollment (number of seats offered). Captured at Census and excludes students who have already withdrawn (W) at that point.
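Both measures are simple ratios; this sketch shows the arithmetic with made-up figures.

```python
# Illustrative arithmetic for average class size (#9) and fill rate (#10).

def average_class_size(active_registrations, classes_taught):
    return active_registrations / classes_taught

def fill_rate(seats_filled, seats_offered):
    return seats_filled / seats_offered * 100

print(average_class_size(360, 20))  # 18.0 students per class
print(fill_rate(360, 480))          # 75.0 percent of seats filled
```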

36 #11 FTE BOR Appointed Faculty Sum of the appointments (1.0, 0.5, etc.) of all BOR-appointed program faculty (excludes lecturers and other non-BOR appointees). Uses the "hiring status" of the faculty member, not the teaching/work load. Uses the Employing Agency Code (EAC) recorded in the Human Resources (HR) database to determine the faculty member's program home. Data provided by the UH Human Resources Office as of 10/31/2012. If the FTE BOR faculty count is off for your program, contact the VCAA. Click here for the count of BOR-appointed program faculty in your program: 2013 BOR Appointed Program Faculty

37 #12 Majors to FTE BOR Appointed Faculty Number of majors (#3) divided by the sum of the appointments (#11) (1.0, 0.5, etc.) of all BOR-appointed program faculty. Data show the number of student majors in the program for each faculty member (25 majors to 1 faculty shown as "25").

38 #13 Majors to Analytic FTE Faculty Number of majors (#3) divided by the number of Analytic FTE Faculty (#13a).
#13a Analytic FTE Faculty (Workload) Calculated as the sum of Semester Hours (not Student Semester Hours) taught in program classes divided by 27. Analytic FTE is useful as a comparison to the FTE of BOR-appointed faculty (#11). Used for analysis of program offerings covered by lecturers.
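A sketch contrasting the two majors-to-faculty ratios (#12 and #13); all input figures are illustrative.

```python
# Analytic FTE faculty (#13a): semester hours taught divided by 27.

def analytic_fte(semester_hours_taught):
    return semester_hours_taught / 27

majors = 75
bor_fte = 3.0                     # sum of BOR appointments (#11)
workload_fte = analytic_fte(108)  # 108 semester hours taught -> 4.0 FTE

# Comparing the two ratios hints at how much teaching is covered by lecturers.
print(majors / bor_fte)       # #12: 25.0 majors per BOR-appointed FTE
print(majors / workload_fte)  # #13: 18.75 majors per analytic FTE
```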

39 #14 Overall Program Budget Allocation The overall program budget allocation = General Funded Budget Allocation (#14a) + Special/Federal Budget Allocation (#14b) + Tuition and Fees (#14c) + Other Funds. The overall program budget allocation is automatically calculated when you enter your general funded budget allocation, special/federal budget allocation, other funds, and/or tuition and fees into the online tool tab called "Cost per SSH." Again, Joni will upload the data for you this year. The overall program budget allocation is to be determined by the College using these guidelines from the VCAA/DOI/ADOI and should include: salaries (general funds, special funds, etc.), overload, lecturers, costs for all faculty and staff assigned to the program, supply and maintenance, amortized equipment, and tuition and fees.

40 #14a General Funded Budget Allocation The general funded budget allocation = actual personnel costs + B-budget expenditures.
#14b Special/Federal Budget Allocation The dollars from Federal grants.
#14c Tuition and Fees The amount collected for tuition and fees in the 2013 academic year.

41 #15 Cost per SSH Overall Program Budget Allocation (#14) divided by SSH in all program classes (#6).
#16 Number of Low-Enrolled (<10) Classes Classes taught (#8) with 9 or fewer active students at Census. Excludes students who have already withdrawn (W) at that point. Excludes Directed Studies (99 series). Includes Cooperative Education (93 series), as there is a resource cost to the program.
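The cost-per-SSH arithmetic as a one-line sketch; the figures are made up.

```python
# Cost per SSH (#15): overall program budget allocation (#14)
# divided by SSH in all program classes (#6).

def cost_per_ssh(total_budget, ssh_all_classes):
    return total_budget / ssh_all_classes

print(cost_per_ssh(315_000, 1260))  # 250.0 dollars per SSH
```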

42 Determination of program's health based on efficiency This year the system office will calculate and report health calls for all instructional programs using AY 2013 data. The following instructions illustrate how those calls are made. Program Efficiency is calculated using 2 separate measures: Fill Rate (#10) and Majors to FTE BOR Appointed Faculty (#12). The following benchmarks are used to determine health for Fill Rate: Healthy: 75 - 100%. Cautionary: 60 - 74%. Unhealthy: < 60%. An Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy.

43 Determination of program's health based on efficiency cont… The following benchmarks are used to determine health for Majors/FTE BOR Appointed Faculty: Healthy: 15 - 35. Cautionary: 36 - 60; 7 - 14. Unhealthy: 61 or more; 6 or fewer. All programs are automatically calculated using the measure above unless they have an externally mandated (e.g. professional accreditation or licensing) capacity of less than 16 students per faculty. An Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy. Finally, average the 2 Overall Category Health Scores for Fill Rate and Majors/FTE BOR Appointed Faculty, then use the following rubric: 1.5 - 2 = Healthy; 0.5 - 1 = Cautionary; 0 = Unhealthy.
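A sketch of the two-measure efficiency call described on this and the previous slide. The lower cautionary bound of 36 (rather than the printed "30", which overlaps the healthy band) is an assumption; function names are illustrative.

```python
# Sketch of the efficiency health call: score each measure 0-2,
# average the two scores, then bucket the average.

def fill_rate_score(pct):
    if pct >= 75: return 2   # Healthy: 75-100%
    if pct >= 60: return 1   # Cautionary: 60-74%
    return 0                 # Unhealthy: < 60%

def majors_per_fte_score(ratio):
    if 15 <= ratio <= 35: return 2             # Healthy
    if 7 <= ratio <= 14 or 36 <= ratio <= 60: return 1  # Cautionary
    return 0                                   # 6 or fewer, or 61+

def efficiency_health(fill_pct, majors_per_fte):
    avg = (fill_rate_score(fill_pct) + majors_per_fte_score(majors_per_fte)) / 2
    if avg >= 1.5: return 2  # Healthy
    if avg >= 0.5: return 1  # Cautionary
    return 0                 # Unhealthy

print(efficiency_health(80, 25))  # both measures healthy -> 2
```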

44 #17 Successful Completion (Equivalent C or Higher) Percentage of students actively enrolled in program classes at Fall and Spring census who at end of semester (EOS) have earned a grade equivalent to C or higher.
#18 Withdrawals (Grade = W) Number of students actively enrolled (who at census have not withdrawn) at Fall and Spring census who at end of semester have a grade of W.

45 #19 Persistence Fall to Spring Count of students who are majors in the program at Fall census (from Fall semester #3) and at the subsequent Spring semester census are enrolled and are still majors in the program. Removed from the count (both numerator and denominator) are program majors to whom a program degree (or CA, if that is the highest credential awarded) has been conferred in the reporting Fall semester. Example: 31 majors start in Fall; 21 majors of the original 31 persist into Spring; 21/31 = .6774, or 67.74%.

46 #19a Persistence Fall to Fall Count of students who are majors in the program at Fall census (from Fall semester #3) and at the subsequent Fall semester census are enrolled and are still majors in the program. Removed from the count (both numerator and denominator) are program majors to whom a program degree (or CA, if that is the highest credential awarded) has been conferred in the first Fall, Spring, or Summer reporting term. Example: 31 majors start in Fall; 11 majors of the original 31 persist into Fall of the next year; 11/31 = .3548, or 35.48%.
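A sketch mirroring the worked example above; it assumes graduates have already been removed from the cohort before the calculation, as the definition requires.

```python
# Persistence (#19 / #19a): share of a fall cohort still enrolled as
# program majors at a later census, expressed as a percentage.

def persistence_rate(cohort, persisters):
    """cohort: set of major IDs at the first fall census (graduates removed);
    persisters: IDs still enrolled as program majors at the later census."""
    return len(persisters & cohort) / len(cohort) * 100

fall_cohort = {f"s{i}" for i in range(1, 32)}  # 31 majors start in Fall
next_fall = {f"s{i}" for i in range(1, 12)}    # 11 persist to the next Fall
print(round(persistence_rate(fall_cohort, next_fall), 2))  # 35.48
```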

47 #20 Unduplicated Degrees/Certificates Awarded Unduplicated headcount of students in the fiscal year reported to whom a program degree or any certificate has been conferred (sum of #20a, #20b, #20c, and #20d). Uses the most recent available freeze of fiscal year data.

48 #20a Degrees Awarded Degrees conferred in the FISCAL_YEAR_IRO. The count is of degrees and may show duplicate degrees received in the program by the same student if the program offers more than one degree. Uses the most recent available freeze of fiscal year data. FISCAL_YEAR_IRO: "Fiscal year, where the value indicates the ending of the fiscal year. For example, a FISCAL_YR_IRO value of 2013 indicates the fiscal year 2012-2013 (July 1, 2012 to June 30, 2013) which includes Summer 2012, Fall 2012, and Spring 2013 semesters…"

49 #20b Certificates of Achievement Awarded Certificates of Achievement conferred in the FISCAL_YEAR_IRO. The count is of program certificates of achievement and may show multiple certificates of achievement in the same program received by the same student. Uses the most recent available freeze of fiscal year data. FISCAL_YEAR_IRO: "Fiscal year, where the value indicates the ending of the fiscal year. For example, a FISCAL_YR_IRO value of 2013 indicates the fiscal year 2012-2013 (July 1, 2012 to June 30, 2013) which includes Summer 2012, Fall 2012, and Spring 2013 semesters…"

50 #20c Advanced Professional Certificates Awarded The count is of program Advanced Professional Certificates and may show multiple Advanced Professional Certificates in the same program received by the same student. Uses the most recent available freeze of fiscal year data.
#20d Other Certificates Awarded The count is of other program certificates and will show multiples received by the same student. Uses the most recent available freeze of fiscal year data.

51 #21 External Licensing Exams Passed These data are entered by the college (the same information currently collected on the Graduates and Leavers surveys). Note: Not all CTE programs have external licensing exams. This value automatically calculates when you click on the "External" tab within the online web submission tool. Just click the edit button for your program, enter the number sitting for the exam and the number that passed, then click the "Save External Data" button.

52 #22 Transfers to UH 4-yr The number of students whose home institution is a UH System 4-yr institution for the first time in Fall and who were in the reporting program prior to that Fall. In the event that a student has more than one major at the college, each program/major receives the count, so an individual student may be counted in more than one program. UH Maui College is included when students transfer from any UHCC program to a UH Maui College four-year program. Number based on Fall semester only.

53 #22a Transfers with Credential from Program Students included in #22 who received a degree from the community college program prior to transfer. Does not include any certificates. Number based on Fall semester only.
#22b Transfers without Credential from Program Students included in #22 who did not receive a degree from the community college program prior to transfer. Number based on Fall semester only.

54 Determination of program's health based on effectiveness This year the system office will calculate and report health calls for all instructional programs using academic year 2013 data. The following instructions illustrate how those calls are made. Program Effectiveness is calculated using 3 separate measures: Unduplicated Degrees/Certificates Awarded (#20) / Majors (#3); Unduplicated Degrees/Certificates Awarded (#20) / Annual New and Replacement Positions (County prorated) (#2); and Persistence Fall to Spring (#19). The following benchmarks are used to determine health for Unduplicated Degrees/Certificates Awarded per major: Healthy: > 20%. Cautionary: 15 - 20%. Unhealthy: < 15%. An Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy.

55 Determination of program's health based on effectiveness cont… The second measure used to determine health is Unduplicated Degrees/Certificates Awarded (#20) / Annual New and Replacement Positions (County prorated) (#2). The following benchmarks are used for this measure: Healthy: .75 - 1.5. Cautionary: .25 - .75 and 1.5 - 3.0. Unhealthy: less than .25 or greater than 3.0. An Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy.

56 Determination of program's health based on effectiveness cont… The third measure used to determine health is Persistence (Fall to Spring) (#19). The following benchmarks are used for this measure: Healthy: 75 - 100%. Cautionary: 60 - 74%. Unhealthy: < 60%. An Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy.

57 Determination of program's health based on effectiveness cont… You should now have a value of zero, one, or two for each of the 3 effectiveness measures. The process of determining the Effectiveness health call score contains the following 3 steps: Step #1: Add up all 3 Overall Category Health Scores for the effectiveness measures (the zeros, ones, and twos you assigned earlier). Step #2: Determine the effectiveness category health call range, where: 5 - 6 = Healthy (=2); 2 - 4 = Cautionary (=1); 0 - 1 = Unhealthy (=0). Step #3: Use the scoring rubric below to determine the effectiveness health call score (for example: if you had a healthy 5 in the previous step, you would assign it a healthy 2 here): 2 = Healthy, 1 = Cautionary, 0 = Unhealthy.
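The three steps condensed into a sketch; the function name and arguments are illustrative.

```python
# Sketch of the effectiveness health call: sum the three category
# scores (each 0-2), then bucket the total per the rubric above.

def effectiveness_health(awards_per_major_score,
                         awards_per_position_score,
                         persistence_score):
    total = awards_per_major_score + awards_per_position_score + persistence_score
    if total >= 5: return 2  # 5-6 -> Healthy
    if total >= 2: return 1  # 2-4 -> Cautionary
    return 0                 # 0-1 -> Unhealthy

print(effectiveness_health(2, 2, 1))  # total 5 -> 2 (Healthy)
```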

58 Determination of program's overall health You should now have a value of zero, one, or two for each of the 3 program health calls: Demand, Efficiency, and Effectiveness. Simply add those 3 values together and use the scoring range rubric below to determine the overall health of your program: 5 - 6 = Healthy; 2 - 4 = Cautionary; 0 - 1 = Unhealthy.
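And the final roll-up, as a sketch combining the three category scores computed above.

```python
# Sketch of the overall health call: sum the demand, efficiency, and
# effectiveness scores (each 0-2) and apply the range rubric.

def overall_health(demand, efficiency, effectiveness):
    total = demand + efficiency + effectiveness
    if total >= 5: return "Healthy"
    if total >= 2: return "Cautionary"
    return "Unhealthy"

print(overall_health(2, 2, 1))  # total 5 -> Healthy
```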

59 #23 Number of Distance Education Classes Taught Measures the number of classes taught with the mode of delivery "Distance Completely Online." In setting up the class, the college indicates the method of instruction used by the instructor in conducting the class. If the method is Distance Education, and the college indicates the "mode" of distance delivery as "Distance Completely Online," the class will be included in this count.
#24 Enrollment in Distance Education Classes At the Fall and Spring census, the number of students actively enrolled in all classes owned by the program and identified as Distance Completely Online (#23). Does not include students who at Census have already withdrawn from the class. The number is an unduplicated count of registrations but a duplicated count of students (e.g. if a student is enrolled in two DCO classes, both are included in this count).

60 #25 (DE) Fill Rate Total active student registrations in program distance education classes (#24) (number of seats filled) at Fall and Spring census divided by the maximum enrollment (number of seats offered). Excludes students who at Census have already withdrawn from the class.
#26 (DE) Successful Completion (Equivalent C or Higher) Percentage of students enrolled in program Distance Education classes (#24) at Fall and Spring census who at end of semester have earned a grade equivalent to C or higher.
#27 (DE) Withdrawals (Grade = W) Number of students actively enrolled in program Distance Education classes (#24) at Fall and Spring census who at end of semester have a grade of W.

61 #28 (DE) Persistence (Fall to Spring, not limited to Distance Ed) Students enrolled in program distance education classes at Fall census who at the subsequent Spring semester census are enrolled in the college. Not limited to students continuing to take program distance education classes. This measure provides college-level data to inform the question, "Do DE students have lower persistence than students enrolled in non-DE classes?" Note: Distance education classes may not be offered in the Fall. Example: 31 students enrolled in DE classes in Fall; 21 of the original 31 students persist into any Spring class; 21/31 = .6774, or 67.74%.

62 About Perkins Core Indicators Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. All Perkins Core Indicator goals not met must be addressed in the narrative and action plan of your review. A concentrator is a student who has a major (taken from the major field in Banner) in a CTE program and who has completed 12 or more credits by the end of the Perkins year. Uses Perkins data from the prior year's Perkins Consolidated Annual Report (CAR); this means that the Perkins data are actually as of the 2011-2012 academic year. Data show the State goal and the College actual.

63 #29 Perkins Core Indicator: Technical Skills Attainment (1P1) Technical Skills Attainment is calculated as: (Number of concentrators who have a cumulative GPA of 2.00 or higher in Career and Technical Education courses and who have stopped program participation in the year reported) divided by (Number of concentrators who have stopped program participation in the year reported).
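A sketch of the 1P1 ratio; the same numerator-over-denominator pattern applies to the other Perkins indicators (#30-#34). The input list and function name are illustrative.

```python
# Sketch of Technical Skills Attainment (1P1) as a simple ratio.

def technical_skills_attainment(leaver_gpas):
    """leaver_gpas: cumulative CTE GPAs for concentrators who stopped
    program participation in the reporting year (the denominator)."""
    met = sum(1 for gpa in leaver_gpas if gpa >= 2.00)  # the numerator
    return met / len(leaver_gpas) * 100

print(technical_skills_attainment([3.1, 2.0, 1.7, 2.5]))  # 75.0 percent
```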

64 #30 Perkins Core Indicator: Completion (2P1) Completion is calculated as: (Number of concentrators who received a degree or certificate in a Career and Technical Education program and who have stopped program participation in the year reported) divided by (Number of concentrators who have stopped program participation in the year reported).

65 #31 Perkins Core Indicator: Student Retention or Transfer (3P1) Student Retention or Transfer is calculated as: (Number of concentrators in the year reported who have not completed a program and who continue postsecondary enrollment or have transferred to a baccalaureate degree program) divided by (Number of concentrators in the year reported who have not completed a program).

66 #32 Perkins Core Indicator: Student Placement (4P1) Student Placement is calculated as: (Number of concentrators in the year reported (previous Perkins year) who have stopped program participation and who are placed or retained in employment, military service, or an apprenticeship program within the unemployment insurance quarter following program completion) divided by (Number of concentrators in the year reported (previous Perkins year) who have stopped program participation).

67 #33 Perkins Core Indicator: Nontraditional Participation (5P1) Nontraditional Participation is calculated as: (Number of participants from underrepresented gender groups who participated in a program that leads to employment in nontraditional fields) divided by (Number of participants who participated in a program that leads to employment in nontraditional fields).

68 #34 Perkins Core Indicator: Nontraditional Completion (5P2) Nontraditional Completion is calculated as: (Number of concentrators from underrepresented gender groups who received a degree or certificate in a program that leads to employment in nontraditional fields) divided by (Number of concentrators who received a degree or certificate in a program that leads to employment in nontraditional fields).

69 #35 Performance Funding: Number of Degrees and Certificates Number of Degrees, Certificates of Achievement, and Advanced Professional Certificates in the program conferred in the fiscal year. This is a count of credentials, not students.
#36 Performance Funding: Number of Degrees and Certificates Native Hawaiian Number of Degrees, Certificates of Achievement, and Advanced Professional Certificates in the program conferred in the fiscal year to Native Hawaiians. This is a count of credentials, not students.

70 #37 Performance Funding: Number of Degrees and Certificates STEM If the CTE program is designated as a Science, Technology, Engineering, or Math (STEM) program, the number of degrees and certificates of achievement (or higher) in the program conferred in the fiscal year will be counted. This is a count of credentials, not students. If this is not a STEM program, the cell in the ARPD datasheet will display "Not STEM". The list of program majors identified as STEM can be found at: STEM Majors

71 #38 Performance Funding: Number of PELL Recipients The number of majors who received a Pell Grant in the Academic Year beginning Fall.
#39 Performance Funding: Number of Transfers to UH 4-yr The number of students whose home institution is a UH System 4-yr institution for the first time in Fall and who were in the reporting program prior to that Fall. UH Maui College is included when students transfer from any UHCC program to a UH Maui College four-year program. Number based on Fall semester only. Note: At the college level, an individual student might be counted in more than one program in the event the student has had more than one major at the college; each program/major receives the count. Therefore, adding transfers program by program will exceed the unduplicated college-level transfer number reported in the Strategic Plan.

72 Comprehensive & Annual Program Review Timeline [Timeline graphic spanning Aug-Jan; most task-to-date pairings did not survive extraction. Tasks, with the dates that did:]
- Follow up on last year's list of suggested improvements for completion
- Plan this year's program review based on suggested improvements from last year's review
- Review the instructional CTE glossary and health call scoring rubric and provide suggestions for improvements to the system office
- Evaluate the ARPD online web submission site for functionality and report bugs to the system office
- Develop/update documentation for the website needed for program review
- Develop training materials for instructional CTE programs / data validation
- Send a campus-wide update on the program review process with a link to the data and the due date (Sept)
- Provide program review training to CTE instructional programs
- All instructional CTE program reviews due to your DC and/or the UHCC ARPD website by EOB Wednesday, November 27th
- VCAA collects all annual unit reviews (Nov 28) and forwards them to IR to submit to the system office as required
- Publish all comprehensive CTE program reviews to the Program-Unit Review website
- Summarize PR process improvement feedback, communicate results to groups, and publish to the web (Jan 5th)
- Administer PR Process Improvement Sessions (develop questionnaire, schedule sessions, collect feedback)
- Work with the Process Improvement Team to determine what feedback needs to be taken back to the UHCC IPRC

73 Questions? The intention of this presentation was to provide a single source for all of the documentation related to the Annual Reports of Program Data, specifically for instructional CTE programs. All of the documents you should need for your ARPD review have been linked directly into this presentation. As always, your feedback regarding this process is essential to making improvements year over year. Please take a moment to provide feedback at the end of this process. Those wishing to print this presentation without color may set their printer settings to print in "grayscale," and set to print in portrait orientation, if desired. If you need more information on this process, please feel free to contact me: HawCC Institutional Analyst Shawn Flood, 934-2648. Mahalo!

74 Annual Report of Program Data SLOs Assessment If you need assistance with the P-SLO tab information, please contact: James Kiley, Institutional Assessment Coordinator. Email: kileyj@hawaii.edu. Phone: 934-2649. Mahalo!

