Presentation transcript:

December 15, 2014 ESEA Flexibility Analysis

Purpose of the ESEA Flexibility Analysis

The flexibility analysis was designed to examine the characteristics of schools identified by each SEA's differentiated accountability system for the 2012–13 and 2013–14 school years, including the performance of all students and all subgroups, based on 2011–12 and 2012–13 student achievement and graduation rate data, respectively.

Purpose of the ESEA Flexibility Analysis (continued)

Caveats: The Flexibility Analysis is…
• Not a replication of individual state identification systems
• An examination of school characteristics at the time of school identification: 2011–12 for Windows 1 and 2; 2012–13 for Windows 3 and 4
• One tool that states can use to analyze whether their identification systems worked as intended to capture the lowest-performing schools and subgroups

Focus of the ESEA Flexibility Analysis

Examine the relationship between school identification and:
• Student achievement in reading and mathematics (proficiency rates and AMOs) for ESEA and combined subgroups
  - Schools and subgroups performing at or below the 5th percentile
  - Schools with large subgroup proficiency gaps
  - Schools and subgroups meeting AMO targets
• Graduation rates and targets for ESEA and combined subgroups
  - Schools and subgroups with graduation rates below 60 percent
  - Schools with large subgroup graduation rate gaps
  - Schools and subgroups meeting graduation rate targets
• Performance against the 95 percent participation rate target on state assessments for ESEA and combined subgroups
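Several of the flags above are simple school-level calculations. The sketch below, in Python with pandas, illustrates two of them (schools at or below the 5th percentile of proficiency and schools with large subgroup proficiency gaps) using entirely hypothetical column names and an illustrative gap threshold; it is not the methodology used for the actual state profiles.

```python
# Illustrative sketch only: flag schools at or below the 5th percentile of
# all-students proficiency and schools with large subgroup proficiency gaps.
# Column names and the 30-point gap threshold are assumptions, not ED's rules.
import pandas as pd

schools = pd.DataFrame({
    "school_id": ["A", "B", "C", "D", "E"],
    "all_students_proficiency": [12.0, 48.5, 73.2, 30.1, 89.4],
    "swd_proficiency": [5.0, 20.0, 40.0, 28.0, 85.0],   # students with disabilities
    "lep_proficiency": [8.0, 25.0, 41.0, 29.0, 80.0],   # limited English proficient
})

# Schools performing at or below the 5th percentile of all-students proficiency
cutoff = schools["all_students_proficiency"].quantile(0.05)
schools["at_or_below_5th_pct"] = schools["all_students_proficiency"] <= cutoff

# Schools where any subgroup trails all students by a large margin
GAP_THRESHOLD = 30.0  # percentage points, illustrative
subgroup_cols = ["swd_proficiency", "lep_proficiency"]
gaps = schools[subgroup_cols].rsub(schools["all_students_proficiency"], axis=0)
schools["large_subgroup_gap"] = (gaps >= GAP_THRESHOLD).any(axis=1)

print(schools[["school_id", "at_or_below_5th_pct", "large_subgroup_gap"]])
```

In the profiles themselves these comparisons cover all ESEA and combined subgroups within each state; the sketch only shows the mechanics.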

Description of the ESEA Flexibility Analysis

• Uses 2011–12 and 2012–13 data to produce 14 analyses/exhibits for each state profile
  - 2011–12 data for: year 1 profiles, Windows 1 and 2
  - 2012–13 data for: year 1 profiles, Windows 3 and 4; year 2 profiles, Windows 1 and 2
• Data quality checks and extensive data outreach to states
  - ED, through the EDFacts Partner Support Center (PSC), contacted specific states that had large amounts of missing or low-quality data. Examples include: a large percentage of operational schools missing Title I participation or eligibility status, not reporting graduation rate indicator data, and not reporting reading or mathematics data for ESEA subgroups.
• Missing or low-quality data submitted by states may result in:
  - Exclusion of an analysis/exhibit from a state's profile (explanations for these exclusions are provided in the cover letter to each state)
  - Exclusion of schools from a specific analysis/exhibit (explanations for these exclusions are provided in the technical notes for each exhibit)
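As a rough illustration of the kind of completeness check described above, the sketch below flags states where a large share of operational schools is missing Title I participation status. The file name, column names, and the 20 percent threshold are assumptions for the example, not the checks ED actually ran through the Partner Support Center.

```python
# Illustrative sketch only: flag states where many operational schools are
# missing Title I participation status. File, columns, and threshold are
# assumptions for the example.
import pandas as pd

schools = pd.read_csv("school_directory.csv")  # hypothetical school-level file

operational = schools[schools["operational_status"] == "Open"]
missing_share = (
    operational.groupby("state")["title_i_participation"]
    .agg(lambda s: s.isna().mean())
)

THRESHOLD = 0.20  # contact states missing the field for more than 20% of schools
states_to_contact = missing_share[missing_share > THRESHOLD].sort_values(ascending=False)
print(states_to_contact)
```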

Priority and Focus School Identification

Priority and Focus School Identification by School Level

School Identification by State-Defined Status Levels

Distribution by School Characteristics

Exhibit 4. At the time of identification, what were the demographic characteristics of priority and focus schools compared to all other Title I participating schools?

Characteristics: Schools Identified as Priority or Focus for 2012–13 / All Other Title I Participating Schools

School Level (Percentage of Schools)
  Elementary: 67.3% / 70.6%
  Middle: 15.3% / 17.1%
  High: 12.0% / 8.6%
  Non-standard (a): 5.3% / 3.8%
  Total: 100.0% / 100.0%

School Type (Percentage of Schools)
  Regular: 90.7% / 96.6%
  Alternative: 7.3% / 2.6%
  Special education: 1.3% / <1%
  Vocational: <1% / <1%
  Total: 99.3% / 99.2%

Charter School Status (Percentage of Schools): 8.7% / 6.7%

Distribution by School Characteristics (continued)

Exhibit 4. At the time of identification, what were the demographic characteristics of priority and focus schools compared to all other Title I participating schools?

Characteristics: Schools Identified as Priority or Focus for 2012–13 / All Other Title I Participating Schools

Urbanicity (Percentage of Schools)
  Large or middle-sized city: 48.7% / 22.1%
  Urban fringe and large town: 36.7% / 43.1%
  Small town and rural area: 14.7% / 34.8%
  Total: 100.0% / 100.0%

Percentage of Students by Race/Ethnicity
  American Indian: 1.9% / 2.6%
  Asian: 2.7% / 2.4%
  Black: 40.0% / 21.8%
  Hispanic: 28.2% / 20.9%
  White: 24.4% / 49.2%
  Total (b): 97.3% / 97.0%

Percentage of Students Eligible for Free or Reduced-Price Lunch: 79.5% / 62.4%
Percentage of Students with Disabilities: 14.1% / 12.0%
Percentage of Limited English Proficient Students (c): 15.0% / 10.5%
Average Total School Enrollment

Exhibit reads: In STATE, 67 percent of Title I participating schools identified as priority or focus for 2012–13 were elementary schools, compared to 71 percent of all other Title I participating schools.

Source: 2011–12 EDFacts, Data Group (DG) 18: Grades offered; DG 21: School type; DG 27: Charter status; DG 39: Membership; DG 74: Children with disabilities (IDEA) school age; DG 123: LEP students in LEP program; DG 565: Free or reduced-price lunch; 2012–13 EDFacts, DG 34: Improvement status - school (n = 1,000 Title I participating schools [150 Title I participating schools identified as priority or focus and 850 all other Title I participating schools]).

Note: Technical notes for this exhibit appear in the Appendix.
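An exhibit like the one above is essentially a two-way percentage distribution. The sketch below shows how a comparable breakdown could be produced from a school-level file; the file name, column names, and category labels are hypothetical, and this is not the code behind the official profiles.

```python
# Illustrative sketch only: percentage distribution of school level by
# identification status, mirroring the structure of Exhibit 4.
import pandas as pd

schools = pd.read_csv("title_i_schools.csv")  # hypothetical: one row per Title I school
# assumed columns: "school_level" (Elementary/Middle/High/Non-standard) and
# "status" ("Priority or Focus" vs. "All Other Title I")

distribution = (
    pd.crosstab(schools["school_level"], schools["status"], normalize="columns") * 100
).round(1)

print(distribution)  # each column sums to roughly 100 percent
```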

Low Performance Among Priority, Focus, and All Other Title I Schools

Large Subgroup Gaps Among Priority, Focus, and All Other Title I Schools

Low Graduation Rates Among Priority, Focus, and All Other Title I Schools

Large Subgroup Graduation Rate Gaps Among Priority, Focus, and All Other Title I Schools

AMO Status Among Priority, Focus, and All Other Title I Schools

Participation Rate Status Among Priority, Focus, and All Other Title I Schools

Status on Graduation Rate Targets: Priority, Focus, and All Other Title I Schools

ESEA Flexibility Analysis Data Extracts

• Each profile is accompanied by an Excel-file data extract that includes: data sources, retrieval dates, and a data summary
  - The data summary includes: a list of all variables, data quality indicators, indicators that flag schools included in or excluded from each of the exhibits, and step-by-step instructions for re-creating selected multi-step descriptive analyses from the profile
• See demonstration using example profile
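For readers who want to work with an extract programmatically, the sketch below opens the Excel file and uses an exhibit inclusion flag to rebuild the school list behind one exhibit. The file name, sheet names, and flag column are hypothetical; the actual extracts define their own variable names and provide their own step-by-step instructions for each analysis.

```python
# Illustrative sketch only: open a state's Excel data extract and use an
# inclusion flag to re-create the school list behind one exhibit.
# File, sheet, and column names are hypothetical.
import pandas as pd

sheets = pd.read_excel("state_profile_extract.xlsx", sheet_name=None)  # all sheets as a dict
print(list(sheets.keys()))  # e.g., data summary, variable list, school-level data

school_data = sheets["school_level_data"]
exhibit_4_schools = school_data[school_data["included_in_exhibit_4"] == 1]
print(len(exhibit_4_schools), "schools included in Exhibit 4")
```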

Example Data Extract

Example Data Extract (continued)

Next Steps

• Profiles will be sent to groups of states in batches
• The first batch of profiles will be sent on December 17th
• After each release, states will have 10 business days to respond with any technical corrections that may be needed
• Technical assistance process:
  - State flex leads should submit questions to OSS state leads in writing
  - PPSS staff will review and respond to technical questions in writing within 1-2 business days
  - If clarifications are still needed, the OSS state lead will schedule a call between PPSS and individual state flex leads

Questions?