Early Intervention Monitoring Wyoming DDD April 2008 Training

Presentation transcript:

1 Early Intervention Monitoring Wyoming DDD April 2008 Training

2 Background

3 Why revise the GSS?
–To ensure implementation of IDEA
–To identify and correct noncompliance in a timely manner
–To facilitate program improvement
–To improve results and functional outcomes for children and families

4 Key Principles
1. Limited indicators, consistently used and closely aligned with results for children/families
2. Data used throughout the year to identify emerging issues; preventative TA
3. Data system responds to most indicators; other data sources as needed

5 Key Principles
4. Annual off-site analysis for all programs
5. Monitoring data used for APR
6. Onsite visits conducted with programs in greatest need
7. Quarterly activities ensure timely, accurate data

6 Stakeholder Process
–Initial stakeholder meeting – April 2007
–Local perspectives
–Review and feedback on materials
–Pilot monitoring process

7 Components of GSS

8 Components of GSS
1. State Performance Plan/Annual Performance Report (SPP/APR)
2. Indicators for Monitoring Regional Programs
3. Wyoming Part C Rules
4. Wyoming Part C Policies and Procedures

9 Components of GSS
5. Interagency Agreements
6. Contracts with Regional Programs
7. Complaints/Dispute Resolution System
8. Off-site and onsite monitoring activities

10 Components of GSS
9. Training and TA System
10. Corrective Action Plans
11. Incentives and Sanctions

11 Indicator Measurement Tool

12 Indicator Measurement Table
–Indicators for Monitoring Regional Programs: indicators 1-10 come from SPP/APR requirements; the remaining indicators come from stakeholder group discussion of priority indicators for WY DDD

13 Indicator Measurement Table
–Timely Services
–Natural Environments
–Child Outcomes
–Family Outcomes
–0-1 Child ID
–0-3 Child ID
–45-day timeline
–Transition
–Timely correction of noncompliance
–Timely submission of data

14 Indicator Measurement Table – Indicators
–Quality Evaluation/Assessment
–Procedural Safeguards
–IFSP Service Provision
–Timely IFSP Meetings
–Quality IFSPs
–Developmental Status on IFSPs
–Clinical Opinion
–Qualified Personnel

15 Indicator Measurement Table – Data Sources
–618 Data
–Data System Reports
–Annual Self Assessment
–CAP Tracking Log
–COSF data
–Previous Monitoring Reports
–Data and Reports Submission Tracking Log
–Personnel Report
–Family Survey Data

16 Indicator Measurement Table – Format
Columns: Indicators, Data Sources, Measurement, Target

17 Indicator Measurement Table – Use of Data
–To report on the SPP/APR
–To identify low performance and noncompliance
–To trigger the development of CAPs
–To make status determinations
–To select programs for onsite visits

18 Data System

19 Data Procedures
Programs establish data procedures, e.g.:
–data entry person and responsibilities
–data accuracy/reliability process
–process to respond to a Request for Data Clarification
–process for using data reports to identify issues, determine training/TA needs, and track progress

20 Data Entry
Data entry expectations:
–Child-specific IFSP data by the 10th of each month
–COSF data within 30 days following completion of the COSF form
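To make these timelines concrete, here is a minimal sketch in Python (the deck itself contains no code; the reading of "by the 10th of each month" as the 10th of the month following services is an assumption, and the function names are illustrative):

```python
from datetime import date, timedelta

COSF_ENTRY_WINDOW = timedelta(days=30)  # COSF data due within 30 days of completion

def ifsp_entry_deadline(year: int, month: int) -> date:
    # Assumption: "by the 10th of each month" means IFSP data for a given
    # month are due by the 10th of the following month.
    if month == 12:
        return date(year + 1, 1, 10)
    return date(year, month + 1, 10)

def cosf_entry_on_time(completed: date, entered: date) -> bool:
    # COSF data are expected within 30 days of completing the COSF form.
    return entered <= completed + COSF_ENTRY_WINDOW
```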

21 Data Verification
–Check data for completeness and accuracy
–Correct data as necessary
–Make records available during onsite visits
–Submit copies of IFSPs and COSFs

22 Report Generation
–Quarterly, to review data entry for accuracy and reliability
–Quarterly, to identify potential issues that may require training and/or TA
–Ongoing, for tracking progress in improving performance and correcting noncompliance

23 Annual Self Assessment

24 Annual Self Assessment
Data for some, not all, indicators:
–Timely services (#1)
–Transition steps and conference (#8a, #8c)
–Evaluation/assessment in all areas (#11b)
–Procedural safeguards (#12a, #12b)
–Services provided by qualified personnel (#13a, #13b)
–Measurable, functional IFSP (#15a-d)
–Clinical Opinion (#17)

25 Annual Self Assessment
–State disseminates in May each year and provides training and TA
–Regional programs review 10% or 10 records (whichever is more), identified by the state
–Programs submit to the state by June 30th

26 Quarterly Self Assessment
–Quarterly record reviews using self-assessment items
–Random selection of child records provided by the state (10% or 10 records)
–Submitted to the state and used to identify emerging issues for training and TA
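The 10%-or-10-records rule lends itself to a short sketch. This Python fragment (hypothetical helper names, not part of the training materials) shows one way a data system might draw the sample:

```python
import math
import random

def review_sample_size(total_records: int) -> int:
    # "10% or 10 records (whichever is more)", capped at the records available.
    return min(total_records, max(10, math.ceil(0.10 * total_records)))

def select_review_records(record_ids: list) -> list:
    # Random selection of child records, as described on the slide.
    return random.sample(record_ids, review_sample_size(len(record_ids)))
```

For example, a region with 40 child records would review 10 of them, while a region with 250 records would review 25.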

27 Off-site and Onsite Monitoring Activities

28 Off-site Monitoring
–To monitor all programs annually
–To identify emerging issues and plan TA
–To verify data
–To track progress

29 Annual Desk Audit
–Conducted in July and August each year
–Allows monitoring of all regions annually without going onsite
–State issues a 'Report Card' to each region regarding its performance on the indicators; requires confirmation of data

30 Annual Desk Audit
–Data system reports
–618 data
–Annual Self Assessment data
–Family survey data
–COSF data
–Complaints and dispute data
–Previous monitoring reports
–Previous CAPs
–Personnel Report
–CAP Tracking Log
–Data and Report Submission Log

31 Annual Desk Audit
The state uses the data to:
–Identify noncompliance (and possible CAPs)
–Make status determinations
–Select sites for onsite monitoring
–Notify programs of findings and decisions
–Respond to the SPP/APR due in February

32 Identification of Noncompliance and Low Performance
Noncompliance:
–Compliance Indicators
–100% target
–A 'finding' requires a CAP
–Correction of noncompliance within 1 year
–Tracking correction and reporting in the SPP/APR

33 Identification of Noncompliance and Low Performance
Noncompliance:
–Individual child instances are not a 'finding' and do not require a CAP. However, the program is still required to correct these instances and report the correction.

34 Identification of Noncompliance and Low Performance
Low performance:
–Performance Indicators
–Target set by stakeholders
–Performance substantially less than the state target requires a CAP
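The distinction between the two indicator types on slides 32-34 reduces to a simple decision rule. In this hedged Python sketch, the 10-point gap standing in for "substantially less" is an assumed placeholder, since the deck does not quantify it:

```python
def requires_cap(indicator_type: str, actual: float, target: float,
                 substantial_gap: float = 0.10) -> bool:
    # Compliance indicators carry a 100% target: any shortfall is a
    # finding that requires a CAP.
    if indicator_type == "compliance":
        return actual < 1.0
    # Performance indicators: a CAP is required only when performance is
    # substantially below the stakeholder-set target (threshold assumed).
    return actual < target - substantial_gap
```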

35 Status Determinations
Data analysis from Desk Audit
Four Status Determination Categories:
–Meets Requirements
–Needs Assistance
–Needs Intervention
–Needs Substantial Intervention

36 Onsite Selection
3 regional programs per year:
–2 based on greatest need
–1 randomly selected
All regions receive an onsite visit within a 5-year period.
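A minimal sketch of the selection rule (Python; `need_scores` is a hypothetical ranking derived from desk-audit data, not a named artifact of the deck):

```python
import random

def select_onsite_programs(need_scores: dict) -> list:
    # 3 regional programs per year: the 2 in greatest need, plus 1 chosen
    # at random from the remaining regions (assumes at least 3 regions).
    ranked = sorted(need_scores, key=need_scores.get, reverse=True)
    return ranked[:2] + [random.choice(ranked[2:])]
```

Guaranteeing that every region is visited within the 5-year window would need extra bookkeeping (for example, excluding recently visited regions from the random draw), which this sketch omits.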

37 Notification Letter
State notifies region in writing regarding:
–Noncompliance
–Need for CAP
–Status determination category
–Onsite selection

38 Ongoing, Preventative Activities
–Quarterly record reviews
–Quarterly conference calls and/or meetings to discuss data
–Specific TA, as needed, statewide or region-specific

39 Other Off-site Activities
–Reviewing and approving CAPs
–Tracking progress and the correction of noncompliance
–Providing TA and training
–Releasing programs from noncompliance
–Providing rewards and sanctions

40 Onsite Monitoring
–Programs in greatest need
–To determine the underlying reasons that contribute to noncompliance and/or low performance

41 Onsite Visit Preparation
–Initial Conference Call
–Selection of Onsite Review Team
–Onsite Review Team Orientation/Training
–Data Analysis
–Selection of Root Cause Analysis Tools
–Onsite Visit Planning Calls
–Onsite Review Team Assignments

42 Conducting Onsite Visits
–Entrance Meeting
–Data Collection
–Data Verification
–Analyses of Data Collected Onsite
–Reporting Results
–Planning Targeted TA
–Issuing a Findings Report

43 Pilot Onsite Visits
Regions share experiences with the onsite monitoring process:
–General comments about how it went
–Advice to other regions

44 Monitoring Team
Invite input regarding the Monitoring Team:
–Who is on the team?
–How can individuals become part of the team?
–What is the selection process? Application process?
–What is the training process?

45 End of Day 1

46 Developing Corrective Action Plans

47 Development of CAPs
Findings of noncompliance and low performance require the development of a CAP.
Remember: Noncompliance must be corrected within 1 year of identification (in writing).

48 Steps related to CAPs: Desk Audit
–Review findings of noncompliance, areas of low performance, and evidence-of-change expectations provided by the state
–Convene a team of knowledgeable staff/providers to conduct root cause analysis
–Identify improvement activities

49 CAP
The CAP form is a table: for each root cause area (Policy & Procedures, Funds, T & TA, Supervision, Personnel Practices), it lists Corrective Action Strategies, the Responsible party, and a Timeline.

50 Steps related to CAPs
–Request TA, as needed, related to root cause analysis, meaningful strategies, etc.
–Submit CAP within 30 days of written identification
–Modify CAP, if necessary, to meet state approval
–Receive targeted TA

51 Steps related to CAPs
–Collect and submit CAP data according to timelines
–Review progress data and modify CAP, if necessary
–Use TA, as needed, to implement CAP
–Review written notification of release from CAP
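The two hard deadlines in this sequence (CAP submission within 30 days of written identification, correction within 1 year, per slides 47 and 50) come down to simple date arithmetic. A small Python sketch, assuming a 365-day year:

```python
from datetime import date, timedelta

def cap_due_date(written_identification: date) -> date:
    # CAP must be submitted within 30 days of written identification.
    return written_identification + timedelta(days=30)

def correction_due_date(written_identification: date) -> date:
    # Noncompliance must be corrected within 1 year of written
    # identification (365 days assumed).
    return written_identification + timedelta(days=365)
```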

52 Developing a Good CAP: Scenario Activity

53 CAP Checklist
Designed for reviewing a CAP to assure the plan can serve its intended purpose: guiding needed systems change and fostering continuous improvement.

54 Tracking Progress
–CAP Tracking Log
–One-year deadline for timely correction
–State may impose changes to the CAP
–State may impose targeted TA
–State will release the program from the CAP once corrected

55 Training and Technical Assistance

56 Training and TA
–Review quarterly data reports with the state to identify training and TA needs
–Attend/access statewide training and TA
–Request and access regional training and TA

57 Incentives and Sanctions
Incentives, e.g. public recognition
Enforcement actions, e.g.:
–Directing use of funds to correct noncompliance
–Imposing special conditions on the contract
–Denying or recouping payment for services for which noncompliance is documented
–Terminating or not renewing the contract

58 Disputes and Complaints
–Regional programs try to resolve complaints or disputes informally
–Programs are now required to track informal complaints/disputes
–The Informal Complaint Tracking Log is submitted to the state annually with the contract
–Tracking informal complaints will help with program improvement

59 Meeting Federal Requirements

60 Federal Requirements
–OSEP requirements for monitoring
–APR indicators and WY priority indicators
–Reporting to the public

61 Status Determinations
1. Meets Requirements
2. Needs Assistance
3. Needs Intervention
4. Needs Substantial Intervention

62 Meets Requirements
–Demonstrates substantial compliance (95%) on all compliance indicators (Indicators 1, 7, and 8)
–All indicators, including performance indicators, have valid and reliable data (actual target data, baseline data, etc.)
–Timely correction of noncompliance identified through monitoring or other means (Indicator 9)

63 Needs Assistance
–Not demonstrating substantial compliance (95%) on one or more compliance indicators (Indicators 1, 7, and 8)
–One or more indicators, including performance indicators, do not have valid and reliable data (actual target data, baseline data, etc.)
–Not demonstrating timely correction of noncompliance (Indicator 9)

64 Needs Assistance
After 2 consecutive years in Needs Assistance, the state may:
–Advise of available sources of technical assistance
–Direct the use of funds
–Identify the program as a high-risk grantee and impose special conditions on the contract

65 Needs Intervention
–Not demonstrating substantial compliance (95%) on one or more compliance indicators, and not making significant progress in correcting noncompliance previously identified on those indicators
–One or more indicators, including performance indicators, are missing valid and reliable data, and the program is not making significant progress in correcting previously identified data problems
–Not demonstrating timely correction of noncompliance, and not making significant progress in correcting that noncompliance
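Slides 62-65 amount to a decision procedure over three desk-audit checks. The Python sketch below is one hedged reading of that logic (field names are illustrative; Needs Substantial Intervention turns on judgments, such as a program's stated unwillingness to comply, that are not modeled here):

```python
from dataclasses import dataclass

SUBSTANTIAL_COMPLIANCE = 0.95  # "substantial compliance (95%)" from slide 62

@dataclass
class DeskAuditResult:
    compliance_rates: dict        # rates for compliance Indicators 1, 7, 8
    valid_reliable_data: bool     # all indicators have valid/reliable data
    timely_correction: bool       # Indicator 9
    significant_progress: bool    # progress on previously identified problems

def determination(r: DeskAuditResult) -> str:
    compliant = all(v >= SUBSTANTIAL_COMPLIANCE
                    for v in r.compliance_rates.values())
    if compliant and r.valid_reliable_data and r.timely_correction:
        return "Meets Requirements"
    # A shortfall with significant progress reads as Needs Assistance; the
    # same shortfall without significant progress reads as Needs Intervention.
    return "Needs Assistance" if r.significant_progress else "Needs Intervention"
```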

66 Needs Intervention
After 3 consecutive years in Needs Intervention, the state may:
–Require development of a CAP
–Require a "compliance agreement" if the program cannot correct within one year (3 years to correct)
–Withhold a percentage of funds
–Recover funds
–Withhold future payments
–Impose other enforcement actions

67 Needs Substantial Intervention
–The failure to substantially comply significantly affects the core requirements of the program, such as the delivery of services to children with disabilities; and/or
–The Regional Program has informed the State that it is unwilling to comply

68 Needs Substantial Intervention
At any time, the state may:
–Recover funds
–Withhold further payments
–Take legal action, such as discontinuing the contract
–Refer the matter for appropriate enforcement action

69 Contracts

70 Summary of Expectations