
1 AY 2010 CTE Instructional Program Review Data Delivery Plan Data Description Process – Timeline Rev. 10-18-10

2 Purpose The primary purpose of this presentation is to provide clarity to all instructional programs on how the data is calculated for both our local Comprehensive and the system-required Annual Program Reviews. We have been asked to produce an annual program review for every one of our instructional programs and units. They are required of each system community college and will be taken to the UH Board of Regents for their review. So who has to do a program review this year? If you are normally scheduled to do a comprehensive review, or are "jumping," you will do a comprehensive review this year. Additionally, every instructional and non-instructional program will do an Annual Program Review this year. Not sure if you're scheduled for a comprehensive review? Click here for the Program-Unit Review Schedule.

3 What are we doing to improve our program review process? Based on the feedback we received from you last year through our program review process improvement focus groups, the following changes have been made. In the interest of continuous process improvement, your suggestions for improving this process are posted on the Assessment Website and linked here for your convenience: Program/Unit Process Improvement Summary.
– Over the course of the summer, the new program review website was developed and most of the documentation was updated to reduce the amount of time it takes to get you the data you need for your review.
– Joni and Guy worked with programs to reevaluate SOC codes.
– Your overall program budget this year will include the actual cost for salaries, the actual B-budget expenditures, and the cost to run your program taking grant dollars into consideration.
– Annual Instructional Program Review data, glossaries, and health call rubrics will be delivered entirely on-line, through an in-house developed program designed for the UHCCs.
– Programs completing comprehensive reviews this year will continue to receive their comprehensive review template with the embedded and formatted data table.

4 What is different this year?
– Five separate training sessions will be given this year in order to focus on specific groups. They are: 1. ATE Division programs; 2. Hospitality, Business Ed, and Nursing Division programs; 3. Liberal Arts Division programs; 4. Public Services; 5. Units.
– This year the Liberal Arts program has its own program review and health call scoring rubric.
– There will be 3 distinct training presentations to accommodate the new Liberal Arts review.
– 2 full years of data provided to programs; next year we will have 3.
– No coversheets.
– No upload capabilities with the web submission tool. Your analysis should be based on the data provided.
– The new "due to VCAA" date this year is November 15th.
– Your budget numbers will be made available to you this year on the Assessment website at AY 2010 Instructional Program Review Budget Table.
– Annual reviews are completely on-line this year with the web submission tool.

5 What’s a Jumper? A Jumper is a locally defined term that is used to describe an instructional program or non-instructional program (unit) that has decided to jump out of their normally scheduled slot for their comprehensive program reviews and into this years cycle. Why would anyone want to do that? Jumping into this years comprehensive cycle means that you will have an opportunity to be considered for any budgetary decisions that will be made in this years budget process. Jumpers will still have to do their comprehensive review on their next scheduled review. Jumping does not affect the existing schedule—you are voluntarily doing an extra review to be considered in this budget cycle. 5

6 I belong to an Instructional Program… which form do I use? COMPREHENSIVE REVIEWS: Your program's data table has been backfilled with Fall 2007 data, embedded and formatted into your comprehensive program review template. You should have everything you need to begin writing your review. The comprehensive template has been emailed to you. (Use this template ONLY if you are scheduled for a comprehensive program review this year or are jumping.) ANNUAL REVIEWS: Your program's data table is available on-line using the link below. You should have everything you need to begin writing your review within the web submission tool. Plan to save your work often, especially when switching between screens, and plan to do most of your formatting within the tool if you are copying and pasting in from Word. UHCC Annual Report of Program Data Web Submission Tool. (ALL instructional programs will need to complete this, even if you're completing a comprehensive review.)

7 Terminology / Timing For AY 2009 the data is pulled as of the Summer 08, Fall 08, and Spring 09 terms. For AY 2010 the data is pulled as of the Summer 09, Fall 09, and Spring 10 terms. The Census freeze event is the fifth Friday after the first day of instruction. The End-of-semester freeze event is 10 weeks after the last day of instruction. FY stands for Fiscal Year. The UH fiscal year runs from July 1st to June 30th. So, for example, if you are looking at the number of degrees in your program for the period FY 2010, that would include all degrees conferred between July 1, 2009 and June 30, 2010.
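
To make the fiscal-year convention concrete, here is a minimal Python sketch (not part of any official UH routine; the function name and example date are hypothetical) that maps a calendar date to the UH fiscal year it falls in:

```python
from datetime import date

def uh_fiscal_year(d: date) -> int:
    """Return the UH fiscal year a date falls in.

    The UH fiscal year runs July 1 through June 30 and is labeled by its
    ending year, so July 1, 2009 - June 30, 2010 is FY 2010.
    """
    return d.year + 1 if d.month >= 7 else d.year

# Example: a degree conferred in December 2009 counts toward FY 2010.
print(uh_fiscal_year(date(2009, 12, 18)))  # 2010
```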

8 Instructional Program Review Data Elements For those programs doing a comprehensive review or jumping this year, Fall 07 data was copied directly in from last year's program review. This should be the last year we see any Fall data, as next year we will have 3 full academic years of data available. The system office reported your AY 2009 data using today's routines, which may vary from what was reported to you last year; this is due to the system office using improved routines over the ones used last year. Student information data this year comes exclusively from the Operational Data Store (ODS). Organizationally this means that all community colleges are getting the data from the same place and at the same snapshot in time (this is a good thing). The following slides explain in detail what data has been provided to you for your comprehensive and annual instructional program review write-ups and how it has been calculated.

9 #1 New and Replacement Positions (State) Economic Modeling Specialists Inc. (EMSI) annual new and replacement jobs at the state level. Compiles data based on Standard Occupational Classification (SOC) codes that the college has linked to the instructional program. Data based on annual new/replacement position projections as of Spring 2010. State position numbers are not prorated. From their website, "…EMSI specializes in reports that analyze and quantify the total economic benefits of community and technical colleges in their region, and also creates data-driven strategic planning tools that help colleges maximize their impact through labor market responsiveness…"

10 #2 New and Replacement Positions (County Prorated) Economic Modeling Specialists Inc. (EMSI) annual new and replacement jobs at the county level. Compiles data based on Standard Occupational Classification (SOC) codes that the college has linked to the instructional program. Note: It is possible for the number of new and replacement positions in the county to be higher than the state if the projection in other counties is for a loss of new and replacement positions. County data is prorated to reflect the number of programs aligned to the SOC code and weighted by the number of majors in each program/institution. Data based on annual new/replacement position projections as of Spring 2010.

11 #3 Number of Majors Count of program majors whose home institution is your college. Count excludes students who have completely withdrawn from the semester at CENSUS. This is an annual number: programs receive a count of 0.5 for each term within the academic year that the student is a major, with a maximum count of 1.0 (one) for each student.
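
The 0.5-per-term, capped-at-1.0 counting rule can be illustrated with a short sketch. This only illustrates the arithmetic described above; the record format, function name, and sample IDs are hypothetical, and it assumes completely withdrawn students have already been filtered out:

```python
from collections import defaultdict

def annual_major_count(major_records):
    """Annual count of majors (#3): 0.5 for each Fall/Spring term a student is
    a declared major, capped at 1.0 per student for the academic year.

    `major_records` is an iterable of (student_id, term) pairs, e.g.
    ("S1", "Fall 2009").
    """
    terms_per_student = defaultdict(set)
    for student_id, term in major_records:
        terms_per_student[student_id].add(term)
    # 0.5 per term, at most 1.0 per student (i.e., at most two terms counted).
    return sum(min(len(terms), 2) * 0.5 for terms in terms_per_student.values())

records = [("S1", "Fall 2009"), ("S1", "Spring 2010"), ("S2", "Spring 2010")]
print(annual_major_count(records))  # 1.5
```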

12 #4 SSH Program Majors in Program Classes The sum of Fall and Spring SSH taken by program majors in courses linked to the program. Captured at Census and excludes students who have already withdrawn (W) at this point. Note: for programs where year-round attendance is mandatory, Summer SSH are included. Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series), as there is a resource cost to the program. Not sure what your program classes are? Click here to find out: Program Course Listing.

13 #5 SSH Non-Majors in Program Classes The sum of Fall and Spring SSH taken by non-program majors (not counted in #4) in courses linked to the program. Captured at Census and excludes students who have already withdrawn (W) at this point. Note: for programs where year-round attendance is mandatory, Summer SSH are included. Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series), as there is a resource cost to the program.

14 #6 SSH in All Program Classes The sum of Fall and Spring SSH taken by all students in classes linked to the program. Captured at Census and excludes students who have already withdrawn (W) at this point. Note: for programs where year-round attendance is mandatory, Summer SSH are included. Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series), as there is a resource cost to the program.

15 #7 FTE Enrollment in Program Classes Sum of Student Semester Hours (SSH) taken by all students in classes linked to the program (#6) divided by 30. Undergraduate, lower-division Full-Time Equivalent (FTE) is calculated as 15 credits per term. Captured at Census and excludes students who have already withdrawn (W) at this point. Note: for programs where year-round attendance is mandatory, Summer SSH are included.
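
A minimal sketch of the arithmetic, with a hypothetical function name and sample SSH value:

```python
def fte_enrollment(annual_program_ssh: float) -> float:
    """FTE enrollment in program classes (#7): annual SSH in all program
    classes (#6) divided by 30, i.e. 15 credits per term over two terms."""
    return annual_program_ssh / 30

print(fte_enrollment(1260))  # 42.0 FTE
```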

16 #8 Total Number of Classes Taught Total number of classes taught in Fall and Spring that are linked to the program. Includes Summer classes if year-round attendance is mandatory. Concurrent and cross-listed classes are counted only once, for the primary class. Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series), as there is a resource cost to the program.

17 Determination of program's health based on demand This year the system office will calculate and report health calls for all instructional programs using academic year 2010 data. The following instructions illustrate how those calls are made. Program Demand is determined by taking the number of majors (#3) and dividing it by the number of New and Replacement Positions by County (#2). The following benchmarks are used to determine demand health: Healthy: 1.5 - 4.0; Cautionary: 0.5 - 1.49 or 4.1 - 5.0; Unhealthy: below 0.5 or above 5.0. Finally, an Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy.
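
Put together, the demand health call can be sketched as below. This is only an illustration of the published benchmarks, not the system office's actual routine; how values falling exactly between the published ranges (e.g., between 4.0 and 4.1) are binned is an assumption, as are the function name and sample inputs:

```python
def demand_health_score(majors: float, county_positions: float) -> int:
    """Demand health call: number of majors (#3) divided by annual new and
    replacement positions, county prorated (#2), scored against the published
    benchmarks (2 = Healthy, 1 = Cautionary, 0 = Unhealthy)."""
    ratio = majors / county_positions
    if 1.5 <= ratio <= 4.0:
        return 2  # Healthy: 1.5 - 4.0
    if 0.5 <= ratio < 1.5 or 4.0 < ratio <= 5.0:
        return 1  # Cautionary: 0.5 - 1.49 or 4.1 - 5.0 (gap values binned here)
    return 0      # Unhealthy: below 0.5 or above 5.0

print(demand_health_score(majors=60, county_positions=25))  # ratio 2.4 -> 2 (Healthy)
```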

18 #9 Average Class Size Total number of students actively registered in Fall and Spring program classes divided by the number of classes taught (#8). Does not include students who have already withdrawn from the class by Census. Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series), as there is a resource cost to the program.

19 #10 Fill Rate Total active student registrations in program classes (number of seats filled) at Fall and Spring census divided by the maximum enrollment (number of seats offered). Captured at Census and excludes students who have already withdrawn (W) at this point.
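
A small sketch of the fill-rate arithmetic, aggregating over class sections; the section tuples, field order, and function name are hypothetical:

```python
def fill_rate(sections):
    """Fill rate (#10): seats filled at census divided by seats offered,
    summed over all program class sections for Fall and Spring.

    `sections` is an iterable of (active_registrations, max_enrollment) pairs.
    """
    filled = sum(active for active, _ in sections)
    offered = sum(capacity for _, capacity in sections)
    return filled / offered

sections = [(18, 20), (9, 24), (22, 22)]    # hypothetical census counts
print(round(100 * fill_rate(sections), 1))  # 74.2 (%)
```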

20 #11 FTE BOR Appointed Faculty Sum of appointments (1.0, 0.5, etc.) of all BOR appointed program faculty (excludes lecturers and other non-BOR appointees). Uses the "hiring status" of the faculty member, not the teaching/work load. Uses the Employing Agency Code (EAC) recorded in the Human Resources (HR) database to determine the faculty member's program home. Data as of March 2010, provided by the UH Human Resources Office. Click here to find out who the BOR Appointed Program Faculty are for your program: Instruction Program BOR Appointments.

21 #12 Majors to FTE BOR Appointed Faculty Number of majors (#3) divided by the sum of appointments (#11) (1.0, 0.5, etc.) of all BOR appointed program faculty. Data show the number of student majors in the program for each faculty member (25 majors to 1 faculty is shown as "25").

22 #13 Majors to Analytic FTE Faculty Number of majors (#3) divided by the number of Analytic FTE Faculty (#13a).

23 #13a Analytic FTE Faculty (Workload) Calculated as the sum of Semester Hours (not Student Semester Hours) taught in program classes divided by 27. Analytic FTE is useful as a comparison to the FTE of BOR appointed faculty (#11). Used for analysis of program offerings covered by lecturers.
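
A sketch of the workload arithmetic and of the comparison mentioned above; the function name and all numbers are made up for illustration:

```python
def analytic_fte_faculty(program_semester_hours_taught: float) -> float:
    """Analytic FTE faculty (#13a): total semester hours (not SSH) taught in
    program classes divided by 27."""
    return program_semester_hours_taught / 27

# Comparing against the FTE of BOR-appointed faculty (#11) suggests how much
# of the program's teaching load is covered by lecturers.
analytic = analytic_fte_faculty(81)  # 3.0 analytic FTE
bor_fte = 2.0                        # hypothetical value for #11
print(analytic - bor_fte)            # ~1.0 FTE presumably covered by lecturers
```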

24 Where are my budget numbers? This year a decision was made to use the actual expenditures of both B-budgets and salaries as part of your overall budget. The budget numbers you need for your program review are now available on the Assessment website at AY 2010 Instructional Program Review Budget Table. Programs doing comprehensive reviews will need to first enter the budget information into the annual web submission tool, then plug those numbers into your data sheets, which have been embedded and formatted into your newly approved template.

25 How do I get the budget numbers into the web submission tool?
1. Log in using your UH username and password.
2. Click on the "Enter web submission" button.
3. Select the "Cost per SSH" tab.
4. Click the "Edit" button for your program.
5. Go to the link on the Assessment website for your budget and enter the value of general funds (this is B-budget plus salary).
6. Enter the value of federal funds.
7. Enter any other funds that you wish to add.
8. Click the "Save Cost per SSH Data" button at the bottom of the page.
Your overall budget allocation (the sum of everything you entered) and the cost per SSH will automatically populate the data table in your review. For those programs doing a comprehensive review, this is where you will get your budget numbers. Note: There is no way to update your budget for AY 2009 this year.

26 #14 Overall Program Budget Allocation The overall program budget allocation = General Funded Budget Allocation (#14a) + Special/Federal Budget Allocation (#14b).

27 #14a General Funded Budget Allocation The general funded budget allocation = actual personnel costs + B-budget expenditures. Personnel costs this year include the salaries for: faculty, lecturers, overload, APT, student help, and clerical.

28 #14b Special/Federal Budget Allocation The expenditure of dollars from Federal grants.

29 #15 Cost per SSH Program Budget Allocation (#14) divided by SSH in program classes (#6).
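
Combining #14 and #15, a minimal sketch of the cost-per-SSH arithmetic (the function name and the dollar and SSH figures are hypothetical):

```python
def cost_per_ssh(general_funds: float, federal_funds: float,
                 other_funds: float, program_ssh: float) -> float:
    """Cost per SSH (#15): the overall program budget allocation (#14), i.e.
    the sum of general (14a), special/federal (14b), and any other funds
    entered in the tool, divided by SSH in program classes (#6)."""
    overall_allocation = general_funds + federal_funds + other_funds  # #14
    return overall_allocation / program_ssh

print(round(cost_per_ssh(250_000, 40_000, 0, 1260), 2))  # 230.16
```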

30 #16 Number of Low-Enrolled (<10) Classes Classes taught (#8) with 9 or fewer active students at Census. Excludes students who have already withdrawn (W) at this point. Excludes Directed Studies (99 series). Includes Cooperative Education (93 series), as there is a resource cost to the program.

31 Determination of program's health based on efficiency This year the system office will calculate and report health calls for all instructional programs using AY 2010 data. The following instructions illustrate how those calls are made. Program Efficiency is calculated using 2 separate measures: Fill Rate (#10) and Majors to FTE BOR Appointed Faculty (#12). The following benchmarks are used to determine health for Fill Rate: Healthy: 75 - 100%; Cautionary: 60 - 74%; Unhealthy: < 60%. An Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy.

32 Determination of program's health based on efficiency, cont. The following benchmarks are used to determine health for Majors/FTE BOR Appointed Faculty: Healthy: 15 - 35; Cautionary: 30 - 60 or 7 - 14; Unhealthy: 61+ or 6 or fewer. An Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy. Finally, average the 2 overall health scores for Class Fill Rate and Majors/FTE BOR Appointed Faculty, then use the following rubric: 1.5 - 2 = Healthy; 0.5 - 1 = Cautionary; 0 = Unhealthy.
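
Putting the two efficiency measures together, a sketch of the full efficiency health call follows. It mirrors the published benchmarks but is not the system office's actual code; where the Healthy (15 - 35) and Cautionary (30 - 60) ranges for Majors/FTE overlap, the sketch assumes Healthy takes precedence, and the function names and sample inputs are hypothetical:

```python
def fill_rate_score(fill_rate_pct: float) -> int:
    """Fill rate (#10): 2 = Healthy (75-100%), 1 = Cautionary (60-74%), 0 = Unhealthy (<60%)."""
    if fill_rate_pct >= 75:
        return 2
    if fill_rate_pct >= 60:
        return 1
    return 0

def majors_per_bor_fte_score(ratio: float) -> int:
    """Majors to FTE BOR Appointed Faculty (#12). Healthy (15-35) is checked
    first; remaining values from 7 to 60 are treated as Cautionary, and
    everything else as Unhealthy (6 or fewer, 61+)."""
    if 15 <= ratio <= 35:
        return 2
    if 7 <= ratio <= 60:
        return 1
    return 0

def efficiency_health_score(fill_rate_pct: float, majors_per_bor_fte: float) -> int:
    """Average the two category scores, then apply the rubric:
    1.5 - 2 = Healthy (2), 0.5 - 1 = Cautionary (1), 0 = Unhealthy (0)."""
    avg = (fill_rate_score(fill_rate_pct) + majors_per_bor_fte_score(majors_per_bor_fte)) / 2
    if avg >= 1.5:
        return 2
    if avg >= 0.5:
        return 1
    return 0

print(efficiency_health_score(fill_rate_pct=82, majors_per_bor_fte=28))  # 2 (Healthy)
```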

33 #17 Successful Completion (Equivalent C or higher) Percentage of students actively enrolled in program classes at Fall and Spring census who at end of semester have earned a grade equivalent to C or higher.

34 #18 Withdrawals (Grade = W) Number of students actively enrolled (at this point have not withdrawn) at Fall and Spring census who at end of semester have a grade of W.

35 #19 Persistence Fall to Spring Count of students who are majors in the program at Fall census (from Fall semester #3) and at the subsequent Spring semester census are enrolled and are still majors in the program. Example: 31 majors start in Fall; 21 of the original 31 persist into Spring; 21/31 = 0.6774, or 67.74%.
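
A short sketch of the persistence arithmetic using the 31/21 example above; the set-based representation, function name, and student IDs are hypothetical:

```python
def fall_to_spring_persistence(fall_majors: set, spring_majors: set) -> float:
    """Persistence Fall to Spring (#19): the share of Fall-census majors who
    are still enrolled and still majors in the program at the Spring census."""
    if not fall_majors:
        return 0.0
    persisted = fall_majors & spring_majors
    return len(persisted) / len(fall_majors)

fall = {f"S{i}" for i in range(1, 32)}    # 31 Fall majors
spring = {f"S{i}" for i in range(1, 22)}  # 21 of them return as majors
print(round(fall_to_spring_persistence(fall, spring), 4))  # 0.6774
```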

36 #20 Unduplicated Degrees/Certificates Awarded Unduplicated headcount of students in the fiscal year reported to whom a program degree or any certificate has been conferred (sum of #20a, #20b, #20c, and #20d). Uses the most recent available freeze of fiscal year data. For ARPD year 2009, the most recent fiscal data on August 15, 2009, was from FY 2008. For ARPD year 2010, the most recent fiscal data on August 15, 2010, was from FY 2010.

37 #20a Number of Degrees Awarded Degrees conferred in the FISCAL_YEAR_IRO. The count is of degrees and may show duplicate degrees received in the program by the same student if the program offers more than one degree. Uses the most recent available freeze of fiscal year data. Note: for ARPD year 2009, the most recent fiscal data on August 15, 2009, was from FY 2008; for ARPD year 2010, the most recent fiscal data on August 15, 2010, was from FY 2010. FISCAL_YEAR_IRO: "Fiscal year, where the value indicates the ending of the fiscal year. For example, a FISCAL_YR_IRO value of 2005 indicates the fiscal year 2004-2005 (July 1, 2004 to June 30, 2005), which includes Summer 2004, Fall 2004, and Spring 2005 semesters…"

38 #20b Certificates of Achievement Awarded Certificates of Achievement conferred in the FISCAL_YEAR_IRO. The count is of program Certificates of Achievement and may show multiple Certificates of Achievement in the same program received by the same student. Uses the most recent available freeze of fiscal year data. FISCAL_YEAR_IRO: "Fiscal year, where the value indicates the ending of the fiscal year. For example, a FISCAL_YR_IRO value of 2005 indicates the fiscal year 2004-2005 (July 1, 2004 to June 30, 2005), which includes Summer 2004, Fall 2004, and Spring 2005 semesters…"

39 #20c Academic Subject Certificates Awarded The count is of program Academic Subject Certificates and may show multiple Academic Subject Certificates in the same program received by the same student. Uses the most recent available freeze of fiscal year data.

40 #20d Other Certificates Awarded The count is of other program certificates (such as APC) and will show multiples received by the same student. Uses the most recent available freeze of fiscal year data.

41 #21 Transfers to UH 4-yr Programs Students whose home campus is UH Manoa, UH Hilo, or UH West Oahu for the first time in Fall 2009 and who prior to Fall 2009 had a UH community college as their home campus. Also includes students who for the first time in Fall 2009 show Maui CC, Applied Business Information Technology as home campus and major. This is a program measure: a student is included in the count of program transfers for as many programs as they have been a major in at the college.

42 #21a Transfers with Credential from Program Students included in #21 who have received a degree from the community college program prior to transfer. Does not include any certificates.

43 #21b Transfers without Credential from Program Students included in #21 who did not receive a degree from the community college program prior to transfer.

44 Determination of program's health based on effectiveness This year the system office will calculate and report health calls for all instructional programs using academic year 2010 data. The following instructions illustrate how those calls are made. Program Effectiveness is calculated using 3 separate measures: Unduplicated Degrees/Certificates Awarded (#20) / Majors (#3); Unduplicated Degrees/Certificates Awarded (#20) / Annual New and Replacement Positions (County prorated) (#2); and Persistence Fall to Spring (#19). The following benchmarks are used to determine health for Unduplicated Degrees/Certificates Awarded per major: Healthy: > 20%; Cautionary: 15 - 20%; Unhealthy: < 15%. An Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy.

45 Determination of program's health based on effectiveness, cont. The second measure used to determine health is Unduplicated Degrees/Certificates Awarded (#20) / Annual New and Replacement Positions (County prorated) (#2). The following benchmarks are used for this measure: Healthy: 0.75 - 1.5; Cautionary: 0.25 - 0.75 or 1.5 - 3.0; Unhealthy: below 0.25 or above 3.0. An Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy.

46 Determination of program's health based on effectiveness, cont. The third measure used to determine health is Persistence (Fall to Spring) (#19). The following benchmarks are used for this measure: Healthy: 75 - 100%; Cautionary: 60 - 74%; Unhealthy: < 60%. An Overall Category Health Score is assigned where: 2 = Healthy, 1 = Cautionary, 0 = Unhealthy.

47 Determination of program's health based on effectiveness, cont. You should now have a value of zero, one, or two for each of the 3 effectiveness measures. The process of determining the Effectiveness health call score involves the following 3 steps. Step 1: Add up all 3 Overall Category Health scores for the effectiveness measures (the zeros, ones, and twos you assigned earlier). Step 2: Determine the effectiveness category health call range, where: 5 - 6 = Healthy; 2 - 4 = Cautionary; 0 - 1 = Unhealthy. Step 3: Use the scoring rubric below to determine the effectiveness health call score (for example, if you had a healthy 5 in the previous step, you would assign it a healthy 2 here): 2 = Healthy, 1 = Cautionary, 0 = Unhealthy.
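
A sketch of the three-step effectiveness roll-up described above; the function name and example scores are hypothetical:

```python
def effectiveness_health_score(awards_per_major_score: int,
                               awards_per_position_score: int,
                               persistence_score: int) -> int:
    """Effectiveness health call: sum the three Overall Category Health scores
    (each 0, 1, or 2) and map the total through the 5-6 / 2-4 / 0-1 rubric."""
    total = awards_per_major_score + awards_per_position_score + persistence_score
    if total >= 5:
        return 2  # Healthy
    if total >= 2:
        return 1  # Cautionary
    return 0      # Unhealthy

# As in the example above, a total of 5 maps to a Healthy score of 2.
print(effectiveness_health_score(2, 2, 1))  # 2
```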

48 Determination of program's overall health You should now have a value of zero, one, or two for each of the 3 program health calls: Demand, Efficiency, and Effectiveness. Simply add those 3 values together and use the scoring range rubric below to determine the overall health of your program: 5 - 6 = Healthy; 2 - 4 = Cautionary; 0 - 1 = Unhealthy.
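
And the final roll-up, as a sketch (the function name and example inputs are hypothetical):

```python
def overall_program_health(demand_score: int, efficiency_score: int,
                           effectiveness_score: int) -> str:
    """Overall program health: add the Demand, Efficiency, and Effectiveness
    health call scores (each 0, 1, or 2) and apply the scoring range rubric."""
    total = demand_score + efficiency_score + effectiveness_score
    if total >= 5:
        return "Healthy"
    if total >= 2:
        return "Cautionary"
    return "Unhealthy"

print(overall_program_health(2, 1, 2))  # Healthy
```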

49 #22 Number of Distance Education Classes Taught Measures the number of classes taught with the mode of delivery "Distance Completely Online." In setting up the class, the college indicates the method of instruction used by the instructor in conducting the class. If the method is Distance Education and the college indicates the "mode" of distance delivery as "Distance Completely Online," the class will be included in this count.

50 #23 Enrollment in Distance Education Classes At the Fall and Spring census, the number of students actively enrolled in all classes owned by the program and identified as Distance Completely Online (#22). Does not include students who at Census have already withdrawn from the class.

51 #24 Fill Rate Total active student registrations in program Distance Education classes (#23) (number of seats filled) at Fall and Spring census divided by the maximum enrollment (number of seats offered). Does not include students who at Census have already withdrawn from the class.

52 #25 Successful Completion (Equivalent C or higher) Percentage of students enrolled in program Distance Education classes (#23) at Fall and Spring census who at end of semester have earned a grade equivalent to C or higher.

53 #26 Withdrawals (Grade = W) Number of students actively enrolled in program Distance Education classes (#23) at Fall and Spring census who at end of semester have a grade of W.

54 #27 Persistence (Fall to Spring, not limited to Distance Ed) Students enrolled in program Distance Education classes at Fall census who at the subsequent Spring semester census are enrolled in the college. Not limited to students continuing to take program Distance Education classes. Example: 31 majors enrolled in DE classes in Fall; 21 of the original 31 persist into any Spring class; 21/31 = 0.6774, or 67.74%.

55 #28 Perkins Core Indicator: Technical Skills Attainment (1P1) Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data that you see in your program review is for the 2010 academic year. All Perkins Core Indicator goals not met must be addressed in the narrative and action plan in your review. A concentrator is a student who has a major (taken from the major field in Banner) for a CTE program and who has completed 12 or more credits by the end of the Perkins year. Uses Perkins data from the prior year Perkins Consolidated Annual Report (CAR). Data show State goal and College actual. Technical Skills Attainment is calculated as: the number of concentrators who have a cumulative GPA of 2.00 or higher in Career and Technical Education courses and who have stopped program participation in the year reported, divided by the number of concentrators who have stopped program participation in the year reported.
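
The Perkins core indicators (#28 through #33) all follow the same numerator-over-denominator pattern, with the college actual compared against a State goal. A generic sketch is below; the function name, counts, and goal value are hypothetical and are not official Perkins figures:

```python
def perkins_indicator_rate(numerator_count: int, denominator_count: int,
                           state_goal_pct: float):
    """Generic Perkins core-indicator calculation: the college actual is the
    numerator divided by the denominator, expressed as a percentage, and is
    then compared against the State goal."""
    actual_pct = 100.0 * numerator_count / denominator_count
    return actual_pct, actual_pct >= state_goal_pct

# Hypothetical 1P1 example: 45 of 52 leaving concentrators had a cumulative
# CTE GPA of 2.00 or higher, measured against a state goal of 90.1%.
actual, goal_met = perkins_indicator_rate(45, 52, 90.1)
print(round(actual, 1), goal_met)  # 86.5 False -> address in the narrative and action plan
```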

56 #29 Perkins Core Indicator: Completion (2P1) Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data that you see in your program review is for the 2010 academic year. All Perkins Core Indicator goals not met must be addressed in the narrative and action plan in your review. A concentrator is a student who has a major (taken from the major field in Banner) for a CTE program and who has completed 12 or more credits by the end of the Perkins year. Uses Perkins data from the prior year Perkins Consolidated Annual Report (CAR). Data show State goal and College actual. Completion is calculated as: the number of concentrators who received a degree or certificate in a Career and Technical Education program and who have stopped program participation in the year reported, divided by the number of concentrators who have stopped program participation in the year reported.

57 #30 Perkins Core Indicator: Student Retention or Transfer (3P1) Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data that you see in your program review is for the 2010 academic year. All Perkins Core Indicator goals not met must be addressed in the narrative and action plan in your review. A concentrator is a student who has a major (taken from the major field in Banner) for a CTE program and who has completed 12 or more credits by the end of the Perkins year. Uses Perkins data from the prior year Perkins Consolidated Annual Report (CAR). Data show State goal and College actual. Student Retention or Transfer is calculated as: the number of concentrators in the year reported who have not completed a program and who continue postsecondary enrollment or have transferred to a baccalaureate degree program, divided by the number of concentrators in the year reported who have not completed a program.

58 #31 Perkins Core Indicator: Student Placement (4P1) Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data that you see in your program review is for the 2010 academic year. All Perkins Core Indicator goals not met must be addressed in the narrative and action plan in your review. A concentrator is a student who has a major (taken from the major field in Banner) for a CTE program and who has completed 12 or more credits by the end of the Perkins year. Uses Perkins data from the prior year Perkins Consolidated Annual Report (CAR). Data show State goal and College actual. Student Placement is calculated as: the number of concentrators in the year reported (previous Perkins year) who have stopped program participation and who are placed or retained in employment, military service, or an apprenticeship program within the unemployment insurance quarter following program completion, divided by the number of concentrators in the year reported (previous Perkins year) who have stopped program participation.

59 #32 Perkins Core Indicator: Nontraditional Participation (5P1) Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data that you see in your program review is for the 2010 academic year. All Perkins Core Indicator goals not met must be addressed in the narrative and action plan in your review. A concentrator is a student who has a major (taken from the major field in Banner) for a CTE program and who has completed 12 or more credits by the end of the Perkins year. Uses Perkins data from the prior year Perkins Consolidated Annual Report (CAR). Data show State goal and College actual. Nontraditional Participation is calculated as: the number of participants from underrepresented groups who participated in a program that leads to employment in nontraditional fields during the reporting year, divided by the number of participants who participated in a program that leads to employment in nontraditional fields during the reporting year.

60 #33 Perkins Core Indicator: Nontraditional Completion (5P2) Perkins Core Indicators are used in the development of the HawCC Program Health Indicator (PHI) reports completed annually for CTE programs. The data that you see in your program review is for the 2010 academic year. All Perkins Core Indicator goals not met must be addressed in the narrative and action plan in your review. A concentrator is a student who has a major (taken from the major field in Banner) for a CTE program and who has completed 12 or more credits by the end of the Perkins year. Uses Perkins data from the prior year Perkins Consolidated Annual Report (CAR). Data show State goal and College actual. Nontraditional Completion is calculated as: the number of concentrators from underrepresented gender groups who received a degree or certificate in a program that leads to employment in nontraditional fields during the reporting year, divided by the number of concentrators who received a degree or certificate in a program that leads to employment in nontraditional fields during the reporting year.

61 Comprehensive & Annual Program Review Timeline
– Develop presentation and training materials (3 presentations and 5 training sessions)
– Provide Annual Program Review training to campus
– Package and delivery of all Program & Unit Reviews to the System Office
– Post all Program & Unit Reviews to the Assessment Website
– All program and unit reviews (Comprehensive and Annual) due to Interim VCAA by web/email by EOB Monday, November 15th
– Work with the Chancellor to determine the best strategy for rolling out the new on-line data submission tool
– Collect and deliver data to Student Support Services
– Build out the 2010 program review page on the Assessment website
– Plan this year's program review based on suggested improvements from last year's review
– Create and post a Program-Unit Review Process Improvement Summary document to capture year-over-year improvements to the existing PR process
– Determine what templates will be used and that they are linked to a common (UHCC) website
– Review and edit all PR documentation and work with Trina and the AC for approvals
– Determine what changes to our local program-unit review processes need to be made and plan for same
– Develop and administer the PR Process Improvement Survey
– Work with the AC and Interim VCAA to determine what needs to be taken back to the UHCC IPRC
– Receive 2 years of data from the system office, perform data validation, and begin building instructional comprehensive data tables
Milestone dates shown on the original timeline graphic (not paired here with specific activities): Jul 28th, Aug 15th, Aug 26th, Aug 28th, Sept 23rd, Oct 23rd, Nov 15, Dec 15th, Dec 20th, Jan 5th, Jan 25th, plus the ranges Jul - Aug, Jul - Sept, Jul - Sept, Aug - Oct, and Thru Oct.

62 AY 2010 Comprehensive & Annual Review Process
Step 1: Write your instructional program review using the appropriate template or web submission tool.
Step 2: Send your documents (one Word document per review) to Interim VCAA Joni Onishi by email no later than end of business, Monday, November 15th, 2010.
Step 3: The Interim VCAA will ensure that all required documents have been received and that they are adequate.
Step 4: The Interim VCAA will forward all approved reviews to the Institutional Research Office for further processing.
Step 5: The Annual reviews will be collected on-line by the System Office for review by the UH Board of Regents. Comprehensive reviews will be forwarded along as appropriate following CERC guidelines.
Step 6: All reviews will finally be converted to PDF and posted to the Assessment Web Site.

63 Questions? The intention of this presentation was to provide a single source for all of the documentation related to the Comprehensive & Annual Instructional CTE Program Review process. I have linked all of the documents you should need directly into this presentation. For additional information, the glossary and health call scoring rubric can be found at: UHCC Annual Report of Program Data Web Submission Tool. If you need more information on this process, please feel free to contact me: Shawn Flood, 974-7512. Mahalo!

