Using Results in FFY 2018 Part B Determinations

Purpose of this Session
- To present OSEP's rationale and process for including results in Entities' Part B determinations.
- To respond to questions following the presentation.
- To get your input on:
  - Using results when making determinations in 2018 under IDEA section 616; and
  - How we can best assist you in messaging to your Chiefs and stakeholders that determinations will be based on compliance and results data.

Rationale for Including Results: Statutory Obligation
Section 616 of the IDEA requires that the primary focus of IDEA monitoring be on improving educational results and functional outcomes for children with disabilities, and on ensuring that States meet the IDEA program requirements.

Rationale for Including Results: Promotes the Purposes of IDEA
- Equality of opportunity
- Educational progress and high expectations for all children
- Full participation in the community
- Independent living
- Economic self-sufficiency

Rationale for Including Results: Revised Accountability System
- Results-Driven Accountability (RDA): balancing educational results and functional outcomes with compliance requirements
- Part B determinations: using both results data and compliance data

Rationale for Including Results: Data
- % Graduated with a regular HS diploma
- % Dropped out
- % Participation of children with disabilities (IDEA) in statewide assessments

Process for Including Results
Include the following results data in determinations:
- Graduation
- Dropout
- Participation in assessments: reading and math at two grade levels, plus the National Assessment of Educational Progress (NAEP) for the Bureau of Indian Education (BIE)

Process for Including Results: Compliance Data Points
(Example scoring matrix for Entities A through H. Columns: compliance Indicators 4B, 9, 10, 11, 12, and 13; Timely and Accurate Data; Timely State Complaints; Timely Due Process Hearings (DPH); Long-Standing Noncompliance; Data Points; and Possible Points (x2).)

Process for Including Results: Results Data Points
(Scoring matrix applied to all Entities. Data points: Indicator 1, % graduated with a regular HS diploma; Indicator 2, % dropped out; Indicator 3B Reading and 3B Math, % participation in the regular assessment at the 4th and 8th grades. Data Points: 6; Possible Points (x2): 12.)

Process for Including Results: An Example of Results and Compliance Overall Scoring

              Total Points Available   Points Earned   Score
Results       12                       6               50
Compliance    10                       10              100
Total Score:                                           150
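Read as a quick arithmetic check, each Score above is simply points earned divided by points available, expressed as a percentage. A minimal sketch (hypothetical Python, with an illustrative helper name not drawn from OSEP's materials) that reproduces the example:

    def score(points_earned, points_available):
        # Percentage of available points actually earned.
        return 100 * points_earned / points_available

    results_score = score(6, 12)      # 50.0
    compliance_score = score(10, 10)  # 100.0
    print(results_score + compliance_score)  # 150.0, the combined total shown above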

Process for Including Results: An Example of Compliance and Results Overall Scoring

Results Weight / Compliance Weight   RDA Percentage
50/50                                75%
40/60                                80%
30/70                                85%

Process for Including Results: A Phased-In Approach
Purpose: Allow time for Entities to transition from a focus on compliance to a focus on results and compliance.
Calculate the RDA Percentage by adding:
- 30% of the Entity's Results Score and 70% of the Entity's Compliance Score in year one;
- 40% of the Entity's Results Score and 60% of the Entity's Compliance Score in year two;
- 50% of the Entity's Results Score and 50% of the Entity's Compliance Score in year three and subsequent years.
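To make the phase-in concrete, the sketch below (a hypothetical Python illustration, reusing the example Entity above with a Results Score of 50 and a Compliance Score of 100) shows how the RDA Percentage shifts as the results weight grows from 30% to 50%:

    def rda_percentage(results_score, compliance_score, results_weight):
        # Weighted combination of the Results Score and the Compliance Score.
        return results_weight * results_score + (1 - results_weight) * compliance_score

    for year, weight in [("Year 1", 0.30), ("Year 2", 0.40), ("Year 3+", 0.50)]:
        print(year, rda_percentage(50, 100, weight))
    # Year 1: 85.0, Year 2: 80.0, Year 3+: 75.0 -- matching the weighting table above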

Questions? Kuestion-mu? Kajjitok? Fesili?

Next Steps
- Submit suggestions for how to include results in determinations directly to your State Lead, or anonymously through your TA provider, by August 19.
- Let your State Lead know how OSEP can assist you in messaging to your Chiefs and stakeholders that determinations will be based on compliance and results data.
- Send any questions you have about this information to your State Lead.
- OSEP will respond to your questions, provide support in informing your Chiefs, and consider your suggestions for how to include results in the FFY 2018 determinations.
THANK YOU!