
Development of Statewide Community College Value-Added Accountability Measures
Michael J. Keller, Director of Policy Analysis and Research, Maryland Higher Education Commission
2007 SHEEO/NCES Network Conference, St. Petersburg, FL, May 9, 2007

Accountability Process
Maryland law requires the public colleges and universities to submit annual performance accountability reports to the Maryland Higher Education Commission. The Commission reviews these reports and presents them, with its assessment, to the Governor and the General Assembly.

Main Elements of the Accountability Reports
The heart of the reports consists of a series of key indicators with benchmarks. A benchmark is the multi-year desired outcome for each indicator that the institution sets for itself. Benchmarks are:
–Achievable
–Indicative of progress
–Based on the performance of similar institutions where possible
–Reflective of funding
Both two- and four-year institutions use this framework, although their accountability reporting requirements differ.

Developments in Community College Accountability Reporting
In 2004, the Maryland Council of Community College Chief Executive Officers established a workteam to recommend revisions to the accountability process used by the two-year institutions. Their goal: to identify indicators that would convey in a more meaningful way the mission of the community colleges and how well they are fulfilling it.

Reaction to the Workteam’s Proposal
The workteam submitted its proposal in fall 2005, following an extensive review of state accountability systems and indicators nationally. The proposal was:
–Endorsed by the Chief Executive Officers and the Maryland Association of Community Colleges.
–Supported by the staff of the Maryland Higher Education Commission, after a few suggested changes were made, and by the analysts of the Governor’s budget office and the General Assembly.
–Approved by the Commission in winter 2006 and implemented the following fall.

Elements of the Revised Community College Accountability Process
The core of the community college report is a set of 32 performance measures that the institutions describe as “mission/mandate” driven. The indicators are standard across all community colleges, although colleges may include additional campus-specific measures if they wish. The standard indicators are organized into seven categories.

Categories of Indicators
–Student characteristics (descriptive only)
–Accessibility and affordability
–Quality and effectiveness: student satisfaction, progress and achievement
–Diversity
–Economic growth and vitality and workforce development
–Community outreach and impact
–Effective use of public funding

Degree Progress Analysis
A key feature of the new model. The previous indicators dealing with retention, graduation and transfer rates were replaced with indicators that examine the “successful persister” rate after four years and the graduation/transfer rate after four years, broken down by the readiness of students to do college-level work.

Successful Persister Rate
Percent of first-time students attempting 18 or more hours during their first two years who:
–Graduated
–Transferred
–Earned at least 30 credits with a cumulative GPA of 2.0 or above, or
–Were still enrolled
The successful persister rate is intended to provide interim measures of progress and to capture the outcomes of community college students with goals other than earning a credential or transferring to a four-year institution.
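As a rough illustration of how such a rate might be computed from cohort records, the sketch below assumes a list of student dictionaries with hypothetical field names (attempted_hours_first_two_years, graduated, transferred, credits_earned, cumulative_gpa, still_enrolled) that are not drawn from the Maryland reporting system, and it treats the four outcomes as alternatives.

```python
# Illustrative sketch only: field names are hypothetical, not the
# Commission's actual data layout.

def successful_persister_rate(cohort):
    """Return the four-year successful persister rate (percent) for an
    entering cohort, or None if no student meets the attempted-hours cutoff."""
    # Denominator: first-time students attempting 18 or more hours
    # during their first two years.
    eligible = [s for s in cohort if s["attempted_hours_first_two_years"] >= 18]
    if not eligible:
        return None

    def is_successful_persister(s):
        # Any one of the four outcomes counts as success.
        return (
            s["graduated"]
            or s["transferred"]
            or (s["credits_earned"] >= 30 and s["cumulative_gpa"] >= 2.0)
            or s["still_enrolled"]
        )

    persisters = sum(1 for s in eligible if is_successful_persister(s))
    return 100.0 * persisters / len(eligible)
```

For example, a record with 24 attempted hours, 36 credits earned, and a 2.4 cumulative GPA would count in both the denominator and the numerator even if the student never graduated or transferred.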

Graduation/Transfer Rate
Percent of first-time students attempting 18 or more hours during their first two years who graduated with an associate degree or a certificate and/or transferred to another institution of higher education.
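Under the same hypothetical record layout used in the sketch above, the graduation/transfer rate keeps the same denominator but counts only graduates and transfers in the numerator, roughly:

```python
def graduation_transfer_rate(cohort):
    """Illustrative sketch: percent of eligible first-time students who
    graduated with an award and/or transferred within four years."""
    eligible = [s for s in cohort if s["attempted_hours_first_two_years"] >= 18]
    if not eligible:
        return None
    completers = sum(1 for s in eligible if s["graduated"] or s["transferred"])
    return 100.0 * completers / len(eligible)
```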

The Breakdowns of Students on the Basis of College Readiness
–College ready: those not requiring any remedial coursework.
–Developmental completers: those who needed remediation in at least one area and, after four years, had completed all of the recommended coursework.
–Developmental non-completers: those who needed remediation in at least one area and, after four years, had not finished all of the recommended coursework.
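A minimal sketch of the grouping logic, assuming each hypothetical record also carries two boolean flags (needed_remediation and completed_developmental_work) that are illustrative rather than the Commission's actual variable names:

```python
def readiness_group(student):
    """Assign a student to one of the three college-readiness groups.

    Illustrative only; the two boolean fields are assumptions.
    """
    if not student["needed_remediation"]:
        return "college ready"
    if student["completed_developmental_work"]:
        return "developmental completer"
    return "developmental non-completer"
```

Separate persister and graduation/transfer rates, and separate benchmarks, can then be produced by filtering a cohort on this grouping before applying the rate calculations sketched earlier.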

Purpose of the Breakdowns
The breakdowns on the basis of college readiness at entry are designed to reflect the differing levels of preparation with which community college students begin their studies. Separate benchmarks are established for each group of students.

Sources of the Degree Progress Data
The figures for the indicators related to degree progress are based primarily on campus-provided data. Numbers on transfers to Maryland private institutions and to out-of-state campuses are obtained from the National Student Clearinghouse. The community colleges are required to supply the Commission with detailed spreadsheets that allow us to verify the accuracy of the calculations.
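A toy sketch of how campus-reported transfers and Clearinghouse matches might be combined into a single transfer flag per student; the sets of student IDs are assumptions for illustration, not a description of the actual matching process:

```python
def combine_transfer_flags(campus_transfers, nsc_matches, student_ids):
    """Mark a student as a transfer if either the campus records or the
    National Student Clearinghouse match list shows enrollment elsewhere.

    All three arguments are hypothetical collections of student IDs.
    """
    return {
        sid: (sid in campus_transfers) or (sid in nsc_matches)
        for sid in student_ids
    }
```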

Analysis of the Degree Progress Data
Information exists for two cohorts of entering students, the first entering in 2000. Statewide, the four-year successful persister rate was 76.7 percent and 65.8 percent, respectively. In both years, the rates were very similar for college-ready students and developmental completers but were considerably lower for developmental non-completers. The graduation/transfer rates for college-ready students and developmental completers exceeded the average for all students.

Usefulness of Degree Progress Analysis to Policy Making
Raw graduation rates, and even graduation/transfer rates, provide a very limited portrait of the performance of community colleges. These measures by themselves do not take into account the goals students have when they enter or their ability to do college-level work. The Degree Progress Analysis gives Maryland officials a more comprehensive picture of community college outcomes than has been available in the past and will help to create a greater understanding of the mission of two-year institutions. In the community college funding hearing in the last session of the General Assembly, the analysts did not repeat their long-expressed concerns about the comparatively low percentage of community college students who earned a credential or transferred to a four-year institution, and they removed references to community college “dropout rates.”

With Questions or For More Information, Contact Dr. Michael Keller