Pouring a Foundation for Program Improvement with Quality SPP/APR Data
OSEP’s message regarding Indicators 1, 2, 13 and 14 – data collection and improvement

Presentation transcript:

Pouring a Foundation for Program Improvement with Quality SPP/APR Data
OSEP’s message regarding Indicators 1, 2, 13 and 14 – data collection and improvement strategies
Ruth Ryder, USDE/OSEP/MSIP

Updates
Status of revisions to information collection (Indicator/Measurement Table)
–Received recommendations that Indicator 13 be revised: longer, shorter, or tweaked
–In process

Updates
Status of OSEP’s review of the APR and revised SPP submissions
–Opportunity for Clarification
–Response Table
–Determinations
–Letters in early June

OSEP Review Process
–State contacts did initial review
–Division did facilitated review
–Division leadership “triaged” all Status Tables
–Opportunity for Clarification
–Developing Response Tables

Indicators 1 and 2
–Only a few States continued to do the comparison to all youth
–Many States are using 618 State-reported data
–Many States revised their improvement activities, usually adding more specific activities for “out years”

Indicators 1 and 2 Issues
–Some States could not provide current data (05-06 data were provided instead)
–Great variations in calculation methodologies (more States are using a cohort approach; see the sketch below)
–Not even close to meeting targets
–Improvement activities: either a “kitchen sink” approach or a minimalist approach
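To make the variation in methodologies concrete, here is a minimal sketch contrasting a single-year (event) rate with a four-year cohort rate for Indicator 1. The counts and the exact formulas are hypothetical, not any State’s actual method.

```python
# Hypothetical counts for one State (illustrative only).
graduates_this_year = 4200   # youth with IEPs who graduated with a regular diploma this year
exiters_this_year = 6000     # all youth with IEPs who exited school this year

# Event (single-year) rate: graduates as a share of all exiters in the year.
event_rate = graduates_this_year / exiters_this_year * 100

# Cohort rate: follow one entering ninth-grade class for four years.
cohort_size = 7000           # youth with IEPs who entered grade 9 four years earlier
cohort_graduates = 4200      # members of that cohort who graduated on time
cohort_rate = cohort_graduates / cohort_size * 100

print(f"Event rate:  {event_rate:.1f}%")   # 70.0%
print(f"Cohort rate: {cohort_rate:.1f}%")  # 60.0%
```

The same underlying counts can yield noticeably different rates, which is one reason reported percentages are hard to compare across States.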

Indicator 13
All States submitted data
–We questioned the validity and reliability of a few States’ data
State compliance ranged from 4.9% to 100% (the percentage is computed as sketched below)
–About 15 States were below 50% compliance
–Many States could not demonstrate timely correction of previously identified noncompliance
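For context, the Indicator 13 percentage is, in essence, the share of reviewed IEPs for transition-aged youth in which every required element is present. A minimal sketch of that arithmetic follows; the checklist items and file-review results are hypothetical and much shorter than an actual checklist.

```python
# Hypothetical file-review results for three IEPs (illustrative checklist items only).
reviewed_ieps = [
    {"student": "A", "measurable_postsecondary_goals": True,  "transition_services": True,  "age_appropriate_assessment": True},
    {"student": "B", "measurable_postsecondary_goals": True,  "transition_services": False, "age_appropriate_assessment": True},
    {"student": "C", "measurable_postsecondary_goals": False, "transition_services": True,  "age_appropriate_assessment": True},
]

def is_compliant(iep):
    # An IEP counts toward the numerator only if every reviewed requirement is met.
    return all(value for key, value in iep.items() if key != "student")

compliant = sum(is_compliant(iep) for iep in reviewed_ieps)
percent = compliant / len(reviewed_ieps) * 100
print(f"Indicator 13 compliance: {percent:.1f}%")  # 33.3% for these hypothetical files
```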

Indicator 13 Issues
What exactly are States reporting to us?
–More than half of the States are using the NSTTAC checklist or some variation
–Remaining States are using their own checklists, and it’s often hard to tell what requirements they are evaluating
What does timely correction look like for this indicator?

Indicator 14
With a few exceptions, States were able to give us data
About 8 States did not provide valid and reliable data
–Denominator issues
–Only graduates included

Indicator 14 Issues
What do the reported data represent?
–Many States did not describe the respondent group
–Can’t determine whether the respondent group is representative of the population (a representativeness check is sketched below)
Small sample sizes
Improvement activities focus on data collection
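One common way to examine representativeness is to compare the respondent group with the full leaver population on key characteristics. The sketch below assumes hypothetical counts by disability category and a ±3 percentage point rule of thumb for flagging gaps; the criterion a State actually uses may differ.

```python
# Hypothetical counts by disability category (illustrative only).
population = {"SLD": 600, "ID": 120, "ED": 180, "Other": 100}   # all leavers in the year
respondents = {"SLD": 310, "ID": 50, "ED": 50, "Other": 40}     # leavers who completed the survey

pop_total = sum(population.values())
resp_total = sum(respondents.values())

# Flag categories where the respondent share differs from the population share
# by more than 3 percentage points (an assumed rule of thumb).
for category in population:
    pop_pct = population[category] / pop_total * 100
    resp_pct = respondents[category] / resp_total * 100
    gap = resp_pct - pop_pct
    flag = "check representativeness" if abs(gap) > 3 else "ok"
    print(f"{category:6} population {pop_pct:5.1f}%  respondents {resp_pct:5.1f}%  gap {gap:+5.1f}  {flag}")
```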

The Challenges: 2007
From our review of the Feb 2007 submissions we identified patterns of challenges:
–The Basics
–Data
–Compliance
–Improvement

The Successes and Challenges: 2008
Successes
–The Basics – much better; States provided the required information, etc.
–Data – much better; correct measurement, correct year
–Compliance – more accurate data, more evidence of timely correction
–Improvement Activities – many States revised and/or added

The Successes and Challenges: 2008
Challenges
–The Basics – keep up the good work!
–Data – reconciling database data with monitoring system data; calculation methodologies for Indicators 1 and 2
–Compliance – documenting timely correction, improving performance
–Improvement Activities – purposeful, linked, sequenced, evidence-based

Improvement Activities: External TA Analysis Categories
–Improve data collection and reporting
–Improve systems administration and monitoring
–Build systems and infrastructures of technical assistance and support
–Provide technical assistance/training/professional development

(Continued)
–Clarify/examine/develop policies and procedures
–Program development
–Collaboration/coordination
–Evaluation
–Increase/adjust FTE

One State’s Perspective on Making the Grade with the SPP/APR
–Attend as many OSEP-funded TA offerings as possible
–Provide accurate and reliable data and, if you can’t, explain why and what you’re doing about it
–Analyze data by local programs
–Develop standard headings, stems, and data formats to use for all indicators

One State’s Perspective on Making the Grade with the SPP/APR
Maintain documentation that you:
–Identify noncompliance at the local level
–Identify research-based improvement activities that match the identified problems
–Require and approve corrective action plans with appropriate timelines
–Oversee timelines and require proof of correction (evidence of success)

Examples

What You’re Doing is Working!
From 1987 to 2003:
–Postsecondary enrollment rose from 15% to 32%
–4-year college enrollment rose from 1% to 9%

What You’re Doing is Working!
–More academic coursework
–More above-average grades
–More congruency between age and grade level
–More support services

What You’re Doing is Working!
–More students with disabilities are exiting with a standard diploma: from 1996 to 2006, rates rose from 42% to 56%
–Fewer students with disabilities are dropping out: from , rates declined from 47% to 26%