2011 OSEP Leadership Mega Conference Collaboration to Achieve Success from Cradle to Career 2.0 State Monitoring Under IDEA: A Snapshot of Past Practices and a Framework for the Future

2011 OSEP Leadership Mega Conference
Collaboration to Achieve Success from Cradle to Career 2.0
State Monitoring Under IDEA: A Snapshot of Past Practices and a Framework for the Future
Julie Bollmer and Marsha Brauen, Westat (Presentation EB)

Overview of Presentation
Study objectives and historical context
Overview of framework for state monitoring systems
Methodology for site visit data collection
Discussion of each framework component
Summary
Questions/discussion

Study Objectives and Historical Context

Study Objectives
Provide a description of the nature and scope of states’ monitoring systems
Describe states’ monitoring systems at two points in time
Create a framework to describe state monitoring systems

Historical Context of the Study
State monitoring systems at two points in time were the focus of the study. Systems were influenced by:
– Federal legislation
– OSEP’s monitoring systems
– SPP and APR implementation

Framework for State Monitoring Systems

Development of Framework
No requirement that states use a particular monitoring approach
Framework used to help describe the variation in state monitoring systems
Framework incorporated important activities of state monitoring systems
Used various sources to create the framework
Multiple reviews by Advisory Panel

Framework for State Monitoring

Methodology for Site Visit Data Collection

Site Visit Data Collection
Random sample of 20 states
Two rounds of data collection
– First round
– Second round
Conducted semi-structured interviews with state staff, local staff, and stakeholders
Reviewed states’ documentation on monitoring activities

Single or Multiple Processes
Number of Monitoring Processes and States with Multiple Monitoring Processes

                                      Part B (N=20) (N=20)   Part C (N=20) (N=20)
Monitoring processes across states
States with more than one process

Discussion of Each Framework Component

Problem Identification
Problem Identification: Comparing performance (e.g., on a specific indicator) to an expectation (e.g., a target established for that indicator) and detecting possible deficiencies
– Indicator and Target Setting
– Indicator Data Collection and Analysis
– Problem Detection
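The problem-detection step described above — comparing reported performance on each indicator to its target and flagging shortfalls — can be sketched in code. The indicator names and values below are hypothetical illustrations, not data from the study, and the sketch assumes higher values are better for every indicator.

```python
# Minimal sketch of problem detection: compare reported performance on
# each indicator to the target established for that indicator, and flag
# possible deficiencies. Indicator names and values are hypothetical,
# and higher values are assumed to be better for every indicator.

def detect_problems(performance, targets):
    """Return {indicator: (reported, target)} for indicators below target."""
    return {
        indicator: (reported, targets[indicator])
        for indicator, reported in performance.items()
        if indicator in targets and reported < targets[indicator]
    }

# Hypothetical SPP/APR-style indicator data (percentages).
performance = {"graduation_rate": 68.0, "timely_evaluations": 97.5}
targets = {"graduation_rate": 75.0, "timely_evaluations": 95.0}

flagged = detect_problems(performance, targets)
print(flagged)  # graduation_rate misses its 75.0 target and is flagged
```

In practice a state’s process would also feed the flagged indicators into problem investigation; the sketch only covers the comparison itself.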

Problem Identification

Examples for Problem Identification:
– Stakeholder committee includes parents of children with disabilities or representatives from advocacy groups.
– Target setting is accomplished through a systematic process.
– Written documentation exists describing the data collection methodologies, including site visits.
– Procedures are in place to monitor the quality of the data collected.
– Findings reflect performance in relation to specific targets.
– State-level problem identification reports are disseminated to stakeholders.

Percentage of Part B and Part C Monitoring Processes That Had Problem Identification

Problem Identification
Number of Monitoring Processes That Used Various Approaches to Select Indicators

How indicators selected                Part B (N=34) (N=32)   Part C (N=24) (N=28)
By state, with stakeholder input       21854
By state, without stakeholder input
By state, based on SPP/APR             †  16   †  19

† Not applicable. Because the 2004 amendments to IDEA, which require states to submit SPPs/APRs, were not enacted until December 2004, this approach was not used by states when selecting their indicators for the first round of data collection.

Problem Investigation
Problem Investigation: Exploring why an identified problem exists and whether it is systemic or localized/isolated

Examples for Problem Investigation:
– Materials used to seek input from stakeholders are tailored to improve understanding and facilitate their contribution to the problem investigation process.
– Individuals conducting problem investigations have appropriate training.
– Procedures are in place to minimize interruptions/disruptions to school, program, or district routines when collecting data.
– Data are verified for accuracy.
– Findings focus directly on the identified problems under investigation.

Percentage of Part B and Part C Monitoring Processes That Had Problem Investigation

Problem Investigation
Number of Monitoring Processes in Which Various Problem Investigation Approaches Were Used

Approach                                Part B (N=16) (N=17)   Part C (N=8) (N=8)
Same approach for all problems          10  10                 7  5
Tailored, based on nature of problem     6   7                 1  3

Corrective Action and Enforcement
Corrective Action and Enforcement: Correcting noncompliance by making immediate changes to documentation, procedures, or practices

Examples for Corrective Action and Enforcement:
– CAPs specify all problem areas.
– Steps/components of the CAPs are clearly delineated for each identified problem.
– Timelines are included for each step/component of the CAPs.
– CAPs are disseminated to local stakeholders beyond the LEA/EIS program administrators.
– Opportunities are provided for LEA/EIS program personnel to discuss CAPs with SEA/lead agency staff.
– The SEA/lead agency has written general enforcement procedures.

Percentage of Part B and Part C Monitoring Processes That Had Corrective Action and Enforcement

Corrective Action and Enforcement
Number of Monitoring Processes in Which Various Follow-up Approaches Were Used to Ensure Corrective Action Plan Implementation

Follow-up approach                      Part B (N=32) (N=32)   Part C (N=21) (N=23)
General communication                   10   9                 4  1
Submit evidence of implementation       51119
Progress reports
Site visits, meetings, data reviews
None (did not follow up)                 8   4                 3  1

Improvement Planning and Implementation
Improvement Planning and Implementation: Developing and implementing strategies for improving systemic performance or reducing the potential for noncompliance
– Improvement Planning
– Improvement Plan Implementation

Improvement Planning and Implementation

Examples for Improvement Planning and Implementation:
– Input is sought from parents of children with disabilities or representatives from advocacy groups.
– Written improvement plans address each problem area.
– Plans describe the ways that implementation will be monitored by the SEA/lead agency.
– Documentation confirms that plans were followed or that changes in plans were justified.
– Meetings are held with the SEA/lead agency to discuss local implementation.

Percentage of Part B and Part C Monitoring Processes That Had Improvement Planning and Implementation

Improvement Planning and Implementation
Number of Monitoring Processes in Which Various Follow-up Approaches Were Used to Ensure Improvement Plan Implementation

Follow-up approach                      Part B (N=23) (N=21)   Part C (N=11) (N=12)
General communication                    6   2                 3  3
Submit evidence of implementation        1   3                 0  2
Progress reports                         5  14                 6  7
Site visits, meetings, data reviews      8  10                 6  6
None (did not follow up)                 8   3                 3  2

Reassessment
Reassessment: Checking to see whether a corrective action plan or an improvement plan has been effective

Examples for Reassessment:
– Reports describe the data used to reassess performance/compliance on specified indicators and targets.
– Opportunities are provided for LEA/EIS program personnel to discuss reassessment reports with SEA/lead agency staff.
– Findings from reassessments are disseminated to state-level stakeholders.

Percentage of Part B and Part C Monitoring Processes That Had Reassessment

Reassessment
Number of Monitoring Processes in Which Reassessments Were Conducted by LEA/EIS Program Staff and/or State Agency Staff

Who conducted reassessments                          Part B (N=11) (N=8)   Part C (N=12) (N=9)
LEA/EIS program staff only                            3  2                 3  1
State agency staff only                               6  6                 5  7
Both LEA/EIS program staff and state agency staff     2  0                 4  1

Summary

Limitations of the Study
The framework has not been evaluated to determine whether its adoption will improve compliance with IDEA and outcomes for students with disabilities; this limits the conclusions that could be drawn from the study (e.g., the study did not look at change over time or trends).
The framework is not the only one that could be developed to represent state monitoring systems.
Retrospective information was collected.

Key Findings
State monitoring systems were not static; in both years, a percentage of them were in transition.
Many factors influenced states’ monitoring systems.
There was substantial variability in the nature and design of states’ monitoring systems.
In both years, few Part B or Part C monitoring processes included all five framework components.

Key Findings
In both years, Problem Identification and Corrective Action and Enforcement were the most common framework components included in state monitoring processes.
States varied in the way they carried out each component and the degree to which elements were present or absent.

Want More Information?
Reports from this study are available on the IES website.
The database is restricted-use and will need to be obtained from IES.
Marsha Brauen
Julie Bollmer
Rob Ochsendorf