2007 OSEP National Early Childhood Conference, December 2007
NJEIS GENERAL SUPERVISION SYSTEM
NJEIS FUNDING TRENDS
NJEIS GENERAL SUPERVISION COMPONENTS
Central Management Office (Data Collection)
Data Desk Audit & Inquiry
Self-Assessment
Focused On-site Monitoring
Targeted Technical Assistance
Procedural Safeguards/Dispute Resolution
Enforcement
NJEIS INFRASTRUCTURE
Lead Agency: Quality Assurance Team, Contracts, Procedural Safeguards, Central Management Office, Monitoring, Personnel Development
Regional Early Intervention Collaboratives (REICs - 4)
Service Coordination Units (SCUs - 21)
Early Intervention Programs (EIPs - 80+)/Practitioners (3,500+)
Targeted Evaluation Teams
Comprehensive Programs
Service Vendors
GENERAL SUPERVISION ACTIVITIES TO ENSURE COMPLIANCE
To ensure ongoing compliance and timely response to emerging issues, data are reviewed periodically as follows:
Service Coordination Units (County SPOE) review weekly; and
Regional Early Intervention Collaboratives review monthly.
NJEIS provides technical assistance as appropriate if issues are identified through ongoing reviews.
CENTRAL MANAGEMENT OFFICE (CMO)
COVANSYS CMO
The Covansys system is designed specifically for early intervention and has been in use for over 10 years. Currently, four states use the system and are actively developing and improving the software.
CMO FEATURES
Child Specific Data Collection
State access to timely statewide data
Local Access to Data
Data Verification (Accuracy)
Provides Accountability
Timely system of payment
Maximization of funding resources
Supports Monitoring
Personnel Enrollment/Matrix Reports
CHILD SPECIFIC DATA COLLECTION
[Timeline diagram: Referral → (45 days) → Initial IFSP → (1 year) → Annual IFSP → (1 year) → Annual IFSP → (1 year) → Transition; entry, exit, and timely services are tracked along the timeline.]
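These milestone dates drive every timeliness check in the system. As a rough illustration only, the sketch below models a child record and the 45-day referral-to-initial-IFSP check; the dataclass layout and field names are assumptions, not the Covansys schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional, Tuple

@dataclass
class ChildRecord:
    """Key milestone dates tracked for each child (field names are hypothetical)."""
    referral_date: date
    initial_ifsp_date: Optional[date] = None       # due within 45 days of referral
    annual_ifsp_dates: Tuple[date, ...] = ()       # roughly one per year thereafter
    transition_conference_date: Optional[date] = None
    exit_date: Optional[date] = None

def initial_ifsp_is_timely(record: ChildRecord, limit_days: int = 45) -> bool:
    """True if the initial IFSP was held within the 45-day timeline."""
    if record.initial_ifsp_date is None:
        return False
    return (record.initial_ifsp_date - record.referral_date).days <= limit_days

# Example: an initial IFSP held 40 days after referral is within the timeline.
child = ChildRecord(referral_date=date(2007, 1, 2), initial_ifsp_date=date(2007, 2, 11))
print(initial_ifsp_is_timely(child))  # True
```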
ACCOUNTABILITY FEATURES
Child must be eligible for Early Intervention
Child must have an active IFSP to receive authorization for services
Practitioners must pass a credentialing process where their experience and licenses are verified
Explanation of Benefits sent to family
Billing authorizations are created based on a completed IFSP and ensure services are being provided according to the IFSP
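Taken together, the features above act as gatekeeping checks that must all pass before a billing authorization is created. The sketch below is a minimal illustration of such checks; the function, field names, and data layout are hypothetical and are not the actual CMO logic.

```python
def can_authorize_service(child: dict, practitioner: dict, service_type: str) -> bool:
    """Gatekeeping checks before a billing authorization is issued.
    Field names are hypothetical, not the actual CMO schema."""
    if not child.get("eligible_for_ei"):          # child must be eligible for EI
        return False
    ifsp = child.get("active_ifsp")
    if not ifsp:                                  # child must have an active IFSP
        return False
    if not practitioner.get("credentialed"):      # experience and licenses verified
        return False
    # The service must appear on the completed IFSP, and the practitioner's
    # specialty must match the service being authorized.
    return (service_type in ifsp.get("services", []) and
            service_type in practitioner.get("specialties", []))

child = {"eligible_for_ei": True, "active_ifsp": {"services": ["speech_therapy"]}}
practitioner = {"credentialed": True, "specialties": ["speech_therapy"]}
print(can_authorize_service(child, practitioner, "speech_therapy"))  # True
```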
DATA VERIFICATION
REICs are responsible for entering the IFSP information into SPOE, which provides ongoing accountability ("Oops" tickets).
Paperflow
REIC data entry
"Funky data" inquiry process
Service and practitioner specialty are matched to ensure the practitioner is qualified
Data verification
On-site visit
CMO SUPPORTS MONITORING
The CMO provides standard and customized reports that support:
Federal reporting requirements, including 618, SPP, and APR
Quality assurance of federal and state performance and compliance requirements
Analysis of child outcome data
Tracking progress, improvement, and correction
Reporting to the state, stakeholders, and the public
CMO REPORTS
Days from referral to initial IFSP (percent of IFSPs that took place >45 days from the referral date).
Days from IFSP meeting to start of services (percent of services starting >30 days from the IFSP meeting date); a computation sketch for these two measures follows this list.
Percent of IFSP services provided in other than the natural environment.
Frequency of periodic reviews, and whether each review takes place within 6 months of the IFSP start date.
The timing of annual IFSP meetings and the percent that exceed 12 months.
Transition Planning Conference within 90 days of a child turning 3 years old.
Children that have exited the system and the reason they transitioned out of NJEIS.
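The first two reports are elapsed-day percentages over pairs of milestone dates. The sketch below shows one way such percentages might be computed; the helper and data layout are illustrative assumptions, not the CMO report implementation.

```python
from datetime import date

def percent_exceeding(pairs, limit_days):
    """Percent of (start, end) date pairs where end is more than limit_days after start."""
    if not pairs:
        return 0.0
    late = sum(1 for start, end in pairs if (end - start).days > limit_days)
    return 100.0 * late / len(pairs)

# Referral -> initial IFSP, flagged when more than 45 days elapse
referral_to_ifsp = [(date(2007, 1, 2), date(2007, 2, 11)),   # 40 days, on time
                    (date(2007, 1, 5), date(2007, 3, 1))]    # 55 days, late
print(percent_exceeding(referral_to_ifsp, 45))   # 50.0

# IFSP meeting -> first service, flagged when more than 30 days elapse
ifsp_to_service = [(date(2007, 2, 11), date(2007, 3, 1))]    # 18 days, on time
print(percent_exceeding(ifsp_to_service, 30))    # 0.0
```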
DATA DESK AUDIT, INQUIRY & CORRECTIVE ACTION PLAN
IDENTIFICATION & CORRECTION OF NONCOMPLIANCE
Review of Data
Agency Performance Inquiry (Off-site)
Analysis of Agency Response to Inquiry
Lead Agency Determination of Noncompliance
Development of Corrective Action Plan (CAP)
Correction of Noncompliance within 12 months
OFFSITE PERFORMANCE INQUIRY
Existing performance data on the required indicator are sent to agencies by the state office. The agency reviews the data and responds to a series of questions within 10–15 business days:
Data verification (clean-up of missing or incorrect information, such as dates)
What was the reason for each delay?
Has the delay since been corrected?
What barriers contribute to the poor performance?
What was the response and/or correction to those barriers?
What is being done to improve performance?
DATA VERIFICATION (Clean-up)
Monitoring Team Data Review Matrix
Lead Agency Data Verification
Regional Data Clean-up
Lead Agency Desk Audit
Additional local data verification & clean-up during inquiry
NJEIS data desk audits are conducted annually by lead agency staff to monitor timely Transition Planning Conference (TPC) timelines. A TPC timeline data run of all children turning three is conducted for all twenty-one counties, and additional information is obtained as necessary from county agencies through an inquiry process. NJ's twenty-one counties are ranked based on cohort size (small, medium, large). Findings of noncompliance are determined and Corrective Action Plans (CAPs) are developed, including required evidence of change. NJEIS provides technical assistance, monitors correction of the noncompliance, and ensures correction within one year of identification of the noncompliance to the county. Each year NJEIS identifies focused areas for on-site monitoring based on statewide compliance and performance data.
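As a rough illustration of the annual desk audit described above, the sketch below checks each child's TPC date against the 90-days-prior-to-third-birthday window as worded on these slides and groups counties by cohort size. The window semantics, cohort cut-offs, and data layout are assumptions, not NJEIS values.

```python
from datetime import date, timedelta

def tpc_timely(third_birthday: date, tpc_date) -> bool:
    """Per these slides, compliant when the TPC occurred within the 90 days
    prior to the child's third birthday (window semantics assumed)."""
    if tpc_date is None:
        return False
    return third_birthday - timedelta(days=90) <= tpc_date <= third_birthday

def cohort_size(n_children: int) -> str:
    """Group counties by cohort size; the cut-offs here are illustrative only."""
    if n_children < 100:
        return "small"
    return "medium" if n_children < 300 else "large"

# Example data run for one county: (third birthday, TPC date) per child
county_records = [
    (date(2007, 6, 1), date(2007, 4, 1)),   # TPC 61 days before birthday: timely
    (date(2007, 6, 1), None),               # no TPC recorded: not timely
]
timely = sum(tpc_timely(b, t) for b, t in county_records)
rate = 100.0 * timely / len(county_records)
print(f"{cohort_size(len(county_records))} county: {rate:.1f}% timely")  # small county: 50.0% timely
```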
Desk Audit: Transition Planning Conference (TPC)
DETERMINATION OF NONCOMPLIANCE
Inquiry data are reviewed by the state office, looking at agency-submitted reasons for delay.
The agency is not held accountable for delays related to family reasons.
The state determines whether the agency should be issued a finding of noncompliance.
If yes, the agency is notified of noncompliance with the federal requirement.
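A minimal sketch of the determination rule described above, in which delays attributable to the family are excluded before deciding whether to issue a finding; the reason codes and data layout are hypothetical.

```python
# Reason codes are hypothetical; the rule is the one stated on this slide:
# delays attributable to the family do not count against the agency.
FAMILY_REASONS = {"family_unavailable", "family_declined", "family_rescheduled"}

def finding_of_noncompliance(delayed_records) -> bool:
    """Issue a finding only if at least one delay is not family-related."""
    return any(r["reason"] not in FAMILY_REASONS for r in delayed_records)

delays = [{"reason": "family_declined"}, {"reason": "staff_shortage"}]
print(finding_of_noncompliance(delays))  # True: one agency-caused delay remains
```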
Data Desk Inquiry
Data Desk Inquiry (continued)
TPC (2006-2007) State Performance
The desk audit identified that 76.1% of the child records reviewed received a TPC. After inquiry to the local counties, 96% of these child records documented that a TPC meeting occurred within 90 days prior to the child's third birthday. Inquiry identified missing or incomplete data in the database, family reasons for delays, or families declining a TPC.
TPC (2006-2007) Local County Performance
A desk audit for one county identified that 60.4% of the children reviewed had a TPC meeting. After inquiry to this county, 85% of these child records documented that a TPC meeting occurred within 90 days prior to the child's third birthday. Inquiry identified missing or incomplete data in the database, family reasons for delays, or families declining a TPC. A CAP was issued to the program to correct the noncompliance.
LOCAL AGENCY CAP INCLUDES
Required evidence of change, with dates for required reporting
Activities to help with improvement
TA available as needed
Change is required!
Letter sent to agency as soon as CAP is successfully completed
CAP REQUIRED EVIDENCE OF CHANGE
Baseline: 85% of children had a TPC 90 days prior to their third birthday
Target Day 30: 85% of children
Target Day 60: 90% of children
Target Day 90: 95% of children
Target Day 120: 100% of children
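The stepped targets above can be checked mechanically against each reporting period's data pull. The sketch below is illustrative only; the function and data layout are assumptions, not part of the NJEIS CAP process.

```python
# Reporting day -> required percent of children with a timely TPC, per the CAP above.
CAP_TARGETS = {30: 85.0, 60: 90.0, 90: 95.0, 120: 100.0}

def cap_status(reported_pcts: dict) -> dict:
    """Compare each reporting day's observed percent to its CAP target.
    reported_pcts maps reporting day (30, 60, 90, 120) to the observed percent."""
    return {day: ("met" if reported_pcts.get(day, 0.0) >= target else "not met")
            for day, target in CAP_TARGETS.items()}

# Example report: on track through day 60, short of the target at day 90.
print(cap_status({30: 87.0, 60: 91.5, 90: 93.0}))
# {30: 'met', 60: 'met', 90: 'not met', 120: 'not met'}
```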
Corrective Action Plan (CAP)
COMPLIANCE NOT ACHIEVED WITHIN ONE YEAR OF FINDING
If an agency completing the monthly data requirements does not demonstrate 100% compliance near the 12-month timeline, the agency is required to submit a written explanation of why 100% compliance was not achieved. The lead agency reviews the explanation, requests further information as needed, and issues additional sanctions. Sanctions include:
Ongoing monthly reporting.
Identification of measurable activities that will drive improvement toward correction.
State-directed training and/or technical assistance.
An on-site focused visit requested by the lead agency.
Placing the agency under "at risk" or "special conditions" status.
Reducing or withholding funding.
SELF-ASSESSMENT
SELF-ASSESSMENT
DHSS-NJEIS contracts include a requirement that NJEIS provider agencies submit an annual self-assessment report.
The self-assessment facilitates supervision through monthly record review and practitioner observation requirements.
It provides data for transition indicators 8a and 8b that are currently unavailable through the CMO data system.
Self-identified improvement planning is expected to proactively remedy performance concerns.
Timely and accurate reporting is tracked for local performance reporting and determination.
ON-SITE FOCUSED MONITORING
ONSITE MONITORING
Decisions to conduct on-site focused monitoring visits may be made under the following circumstances:
As needed, based on incident reports or procedural safeguards complaints;
As needed, based on concerns identified through ongoing review of system point of entry (SPOE) or self-assessment data; and
Based on ranked performance data related to priority indicators.