Measuring Nontraditional Participation


Measuring Nontraditional Participation Mimi Lufkin National Alliance for Partnerships in Equity Data Quality Institute February 8-10, 2006

Legislative History
Gender equity provisions in Perkins:
- 1976 Amendments: full-time Gender Equity Coordinator ($50,000)
- 1984 Perkins Act: full-time Gender Equity Coordinator ($60,000); set-asides of 3.5% Gender Equity, 8.5% SP/DH
- 1990 Perkins Act: set-asides of 3% Gender Equity, 7% SP/DH, 0.5% either
- 1998 Perkins Act: accountability and other provisions

Perkins Accountability Measures
"Student participation in and completion of vocational and technical education programs that lead to nontraditional training and employment"
- 4s1 and 4p1: Participation
- 4s2 and 4p2: Completion

Nontraditional Training and Employment “Occupations or fields of work, including careers in computer science, technology, and other emerging high skill occupations, for which individuals from one gender comprise less than 25 percent of the individuals employed in each such occupation or field of work.”
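The 25 percent rule quoted above can be expressed as a small check. This is an illustrative sketch only: the occupation names and employment shares below are hypothetical, not actual labor-market data.

```python
# Sketch of the statutory 25% rule: a field is "nontraditional" for a
# gender when that gender comprises less than 25% of those employed in it.
NONTRAD_THRESHOLD = 0.25

def nontraditional_for(female_share: float) -> list:
    """Return the gender(s) for which an occupation is nontraditional,
    given the share of employed individuals who are female."""
    genders = []
    if female_share < NONTRAD_THRESHOLD:
        genders.append("female")
    if (1 - female_share) < NONTRAD_THRESHOLD:
        genders.append("male")
    return genders

# Illustrative shares only (not real employment statistics).
occupations = {
    "Welding": 0.05,
    "Nursing": 0.90,
    "Computer Science": 0.22,
}

for name, share in occupations.items():
    print(name, nontraditional_for(share))
```

Note that an occupation with a share between 25 and 75 percent is nontraditional for neither gender, so the function can return an empty list.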

Current Participation Measure
Numerator: number of students in underrepresented gender groups who participated in a nontraditional program in the reporting year.
Denominator: number of students who participated in a nontraditional program in the reporting year.
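The numerator/denominator definition above amounts to a simple ratio over student records. The sketch below assumes hypothetical record fields (`in_nontrad_program`, `underrepresented_gender`); real CAR data carries many more fields and state-specific coding.

```python
# Hedged sketch of the 4s1/4p1 participation rate:
# (underrepresented-gender participants) / (all participants)
# in nontraditional programs for the reporting year.

def participation_rate(students):
    """Compute the nontraditional participation rate from a list of
    student records (dicts with illustrative boolean fields)."""
    participants = [s for s in students if s["in_nontrad_program"]]
    if not participants:
        return 0.0
    underrep = [s for s in participants if s["underrepresented_gender"]]
    return len(underrep) / len(participants)

# Illustrative records: 3 participants, 1 of the underrepresented gender.
students = [
    {"in_nontrad_program": True,  "underrepresented_gender": True},
    {"in_nontrad_program": True,  "underrepresented_gender": False},
    {"in_nontrad_program": True,  "underrepresented_gender": False},
    {"in_nontrad_program": False, "underrepresented_gender": False},
]
print(participation_rate(students))  # 1 of 3 participants -> 0.333...
```

Note that students not enrolled in a nontraditional program drop out of both numerator and denominator, which is why the debate below over defining "participation" (enrollment vs. concentration) changes the measure so much.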

Defining Participation
Currently, participation is defined as enrollment in a program that has been identified as nontraditional. Reauthorization provides an opportunity to redefine participation:
- Enrollment?
- Concentration?

Enrollment Pros
- All states define enrollment the same way (almost!)
- A measure of the social and institutional barriers students face before enrolling in nontraditional courses
- A measure of exploration opportunities
- The denominator data for this measure is not reported anywhere else in the CAR, while the numerator data can also be found in the enrollment report, disaggregated by gender, race, and special population status
- It is easier to design and implement local improvement strategies directed at enrolling students; moving to a concentrator measure means success is counted only for students who take several courses

Enrollment Cons
- Students enroll in multiple introductory-level courses; where do they get assigned? Many programs may share introductory courses
- Not a measure of institutional barriers encountered while participating, and does not alert the institution for early intervention
- Measures those "looking," but not necessarily those committed
- Eliminates the ability to evaluate the relationship of program "participation" to completion at a detailed level, and introduces a new cohort that may not be comparable to the exiting cohort or share the same event history (e.g., fee changes, social crises)
- Produces unclear participation rates because of data quality issues and the difficulty of determining actual student intent

Concentration Pros
- A measure of those actually committed
- Reflective of retention, and captures institutional barriers encountered while participating
- All other measures are based on concentrators, allowing comparison of similar cohorts
- Enrollment data is available in the CAR enrollment report, but only disaggregated by gender within nontraditional programs
- With standardization of the concentrator definition, all states will define concentrator the same way

Concentration Cons
- Provides lower participation rates than enrollment
- Does not provide information on schools' success in motivating underrepresented students to try programs
- Little difference between concentrator and completer under some states' definitions
- Barriers may occur before the concentration threshold is reached
- Enrollment data for all students in nontraditional programs is not reported anywhere in the CAR; the enrollment measure cannot be recreated from enrollment data
- Cannot be compared to 2s1/2p1 because the denominator is NTO concentrators, not all concentrators in NTO programs
- Currently, concentrator is defined differently from state to state

Overall Issues
- This is the only core indicator that measures an entry point (participation) into the CTE system; all other measures are exit measures.
- The real value of the measure for program improvement comes when it is disaggregated by gender, race/ethnicity, or special population status, or by program, CIP code, or cluster.
- An alternative approach is to define a new threshold for this indicator, such as two courses at the secondary level, and declaring a major or program enrollment at the postsecondary level.
- If we are measuring the effectiveness of schools at getting students to enter (participate in) programs nontraditional for their gender, where do we place the bar of success: at the enrollment level, or at the higher concentrator level?