A Guide to Analyzing PrOF Instructional Data Packets CRC Research Office 2009.

Presentation transcript:

A Guide to Analyzing PrOF Instructional Data Packets CRC Research Office 2009

Background Information

The PrOF data packets were developed using information contained in the PeopleSoft Student Information System. The packets show student enrollment, demographic, academic success, and semester-to-semester persistence data, as well as departmental WSCH/Instructional FTE/Productivity information, for the past four academic years. The data, presented both graphically and numerically, helps departments identify trends and differences and compare departmental data with college-wide data. These trends and comparisons should inform the identification of strengths, opportunities, and planning ideas that will enhance program effectiveness. If you have any questions about the information contained in these packets, please contact the College Research Office at (916)

Overview

[Slide diagram: Departmental data + College-wide data → Differences, Changes and/or Commonalities; "Looking back" at what happened]

A Guide to Data Analysis for Instructional Programs

The PrOF data packets are arranged so you can look at trends within your departmental data and compare them with the College as a whole. In many cases you may find that your departmental trends closely mirror overall College-wide trends, but you may also see that your departmental trends differ greatly from the College-wide data. Either finding may have implications for departmental planning.

The PrOF data packets graphically and numerically represent each of the demographic and outcome measures listed below. The past four academic years are analyzed and displayed in the charts to allow you to track trends over time.

Student Access and Demographics – Departmental Student Enrollment by:
– Age group
– Age group (collapsed)
– Gender
– Ethnic group
– Educational goal
– Educational level
– Instructional mode
– Course level
– Freshman status
– English primary language

Student Success:
– Departmental Average Course Success Rates by the same ten categories listed above
– Semester-to-semester persistence rates
– Departmental WSCH/Instructional FTE/Productivity
– Degrees and/or Certificates Awarded

GLOSSARY OF TERMS
Program Review Overview and Forecasting (PrOF)

Knowing the following terms will help you with your data analysis:

Department - the grouping of courses that are related in content.

Course Success Rate - the average percentage of students who successfully complete a class with a grade of "A", "B", "C", or "CR", out of the overall number of students enrolled in the class. (Students who dropped before the fourth week of classes are automatically excluded from the calculation.)
Numerator = number of students (duplicated) with A, B, C, or CR
Denominator = number of students (duplicated) with A, B, C, D, F, CR, NC, W, or I

Persistence - the percentage of students enrolled in a particular department (regardless of course outcome) in a given semester who enroll at the college in the subsequent semester.
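Because the course success rate defined above is a simple ratio of grade counts, a short sketch may make the arithmetic concrete. This is an illustration only, not code from the Research Office; the grade sets follow the glossary, while the function name and sample grades are assumptions.

```python
# Illustrative sketch of the course success rate defined in the glossary.
# Grade sets follow the glossary; names and sample data are assumptions.

SUCCESS_GRADES = {"A", "B", "C", "CR"}
COUNTED_GRADES = {"A", "B", "C", "D", "F", "CR", "NC", "W", "I"}

def course_success_rate(grades):
    """Percent of duplicated enrollments ending in A, B, C, or CR, out of
    all enrollments ending in a counted grade. Students who dropped before
    the fourth week are assumed to be excluded upstream."""
    counted = [g for g in grades if g in COUNTED_GRADES]
    if not counted:
        return None  # no gradable enrollments
    successes = sum(1 for g in counted if g in SUCCESS_GRADES)
    return 100.0 * successes / len(counted)

# 7 of 10 counted enrollments succeeded -> 70.0
print(course_success_rate(["A", "B", "C", "CR", "C", "B", "A", "D", "F", "W"]))
```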

GLOSSARY OF TERMS (cont.)
Program Review Overview and Forecasting (PrOF)

Duplicated Enrollment - the number of total enrollments in a particular department. A student is counted once for every individual enrollment in that department during a given term; in other words, a student who enrolls in three courses in a given department in a given term is counted three times.

WSCH - acronym for Weekly Student Contact Hours; the total student contact hours for the semester.

FTE - acronym for Full-Time Equivalent. A professor teaching a full load is counted as 1.00 FTE; professors teaching an overload, or carrying a reduced teaching load for a given semester, are adjusted accordingly.

Productivity - the total WSCH divided by the total FTE.
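Since productivity is defined above as total WSCH divided by total FTE, the calculation itself is trivial; the sketch below (with assumed, illustrative figures, not CRC data) simply guards against a zero denominator.

```python
# Illustrative sketch of the productivity calculation: WSCH / FTE.
# The sample figures are assumptions, not CRC data.

def productivity(total_wsch, total_fte):
    """Weekly student contact hours generated per full-time-equivalent
    instructor; returns None if there is no instructional FTE."""
    if total_fte == 0:
        return None
    return total_wsch / total_fte

# A department generating 5,250 WSCH with 10.0 instructional FTE:
print(productivity(5250.0, 10.0))  # 525.0
```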

Analyzing the Data

The Big Picture

As you review your data:
– Look for trends, patterns, or interesting differences in your program/department data
– Look for trends, patterns, or interesting differences when your data is compared with college-wide data
– Think about factors that might contribute to these trends or differences (scheduling, new interventions, new course design, etc.)
– Think about challenges that might be contributing to these trends or differences (facilities, decreased FTE, changes in curriculum, scheduling or instructional mode, etc.)

These trends, patterns, differences, factors, and challenges should inform the identification of program strengths, opportunities, and planning ideas in PrOF.

Identifying Trends

Within your data, look for:
– Increases over the past four years (an upward tendency in the graph)
– Decreases over the past four years (a downward tendency in the graph)
– Cycles in the data (an up-and-down pattern in the graph)
– Noticeable changes over a shorter time period, which may warrant further investigation, particularly if they appear on multiple slides

Examples
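As a first, schematic example (an illustration invented for this guide, not part of the PrOF packets), the sketch below shows how these four-year patterns might be flagged programmatically; the function name and noise tolerance are assumptions. The graph examples follow.

```python
# Illustrative sketch: classify a short series as an increase, a decrease,
# a cycle, or flat. The tolerance (in the data's own units) is an assumption.

def classify_trend(values, tolerance=1.0):
    diffs = [b - a for a, b in zip(values, values[1:])]
    if all(d > tolerance for d in diffs):
        return "increase"
    if all(d < -tolerance for d in diffs):
        return "decrease"
    if any(d > tolerance for d in diffs) and any(d < -tolerance for d in diffs):
        return "cycle"
    return "flat"

# Four academic years of (hypothetical) duplicated enrollment:
print(classify_trend([812, 840, 835, 901]))  # 'cycle' (up, down, up)
```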

This graph shows that the department is experiencing an increase in the percentage of African American and Hispanic students and a corresponding decrease in the percentage of Asian/Pacific Islander and White students.

This graph shows that course success rates have improved for both instructional modes over the past two years. Course success rates in online courses were slightly higher than in other types of classes in 08-09, something that was not true in previous years. It should be noted, however, that the small number of online classes in the department may exaggerate observed trends.

This graph shows a cycle of greater fall enrollments compared with spring and indicates an overall pattern of increasing unduplicated enrollments.

Identifying Differences

Within your data:
– Look for groups for which the data exceeds or falls below the data for other groups
– Look for years where the data differs from the other years
– Look for data points that don't follow an observed trend

When comparing your data with College-wide data:
– Look for trends that differ from College-wide trends
– Look for situations where program data exceeds or falls below College-wide data

Examples
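As a schematic example (again an invented illustration, not CRC data), the sketch below flags groups whose departmental success rate differs notably from the college-wide rate; the group names, rates, and five-point threshold are all assumptions. Graph examples follow.

```python
# Illustrative sketch: flag groups whose departmental rate differs from the
# college-wide rate by more than a chosen number of percentage points.

def flag_differences(dept_rates, college_rates, threshold=5.0):
    flags = {}
    for group, dept_rate in dept_rates.items():
        college_rate = college_rates.get(group)
        if college_rate is not None and abs(dept_rate - college_rate) > threshold:
            flags[group] = dept_rate - college_rate  # positive: dept exceeds college
    return flags

dept = {"First-time freshmen": 58.0, "Other students": 66.5}
college = {"First-time freshmen": 65.0, "Other students": 67.0}
print(flag_differences(dept, college))  # {'First-time freshmen': -7.0}
```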

The fluctuation between the Fall 07 and Spring 08 headcount is much smaller than the other fluctuations, a pattern that did not continue during the next academic year.

This graph shows the department's course success rates by students' enrollment status (whether or not the student was a first-time freshman). Course success rates have varied over the four years. However, first-time freshmen course success rates were slightly lower than those of other students in all years prior to the most recent year shown.

Comparing the department data with college-wide data shows that the department serves a younger student clientele than the rest of the college (note that the scales on the two graphs, Department and College-wide, are not the same).

The department's course success rates for African American students are greater, and have increased more, than college-wide course success rates for the same group. In addition, departmental course success rates for White students have increased, whereas college-wide course success rates have decreased. The variation in the departmental data for American Indian students may reflect the low number of students from this group taking classes in the department, which may exaggerate observed trends. (The two graphs show Department and College-wide data.)

Making Meaning from the Trends and Differences

Implications of the Data

Program strengths can be identified from:
– Increases/upward trends within the departmental data (overall or for one group)
– Areas in which the departmental data exceeds college-wide data
– Differences within the departmental data

Opportunities can be identified from:
– Decreases/downward trends in the departmental data
– Areas in which the departmental data falls below college-wide data
– Differences within the departmental data
– Factors that might be limiting growth and/or the success of students in the department

Generating Planning Ideas

After analyzing your department's Program Review Data Packets, you may be able to generate planning ideas by:
– Extending or expanding programs and/or changes that may have contributed to program strengths or improvements
– Identifying and addressing the factors that might be negatively affecting growth or success in the department
– Identifying and planning to implement best practices within the department or from other institutions similar to CRC