2011 OSEP Leadership Mega Conference Collaboration to Achieve Success from Cradle to Career 2.0 Using Part C Data Analytic Modules for Program Improvement.

Presentation transcript:

2011 OSEP Leadership Mega Conference Collaboration to Achieve Success from Cradle to Career 2.0. Using Part C Data Analytic Modules for Program Improvement: A State and Local Perspective. Haidee Bernstein, Data Accountability Center (DAC); Saundra Harrington, Local System Manager, Infant and Toddler Connection of Norfolk; Debra B. Holloway, The Arc of Virginia and VICC member; Sharon Walsh, Data Accountability Center (DAC); Mary Anne White, Monitoring Consultant, Infant and Toddler Connection of Virginia. Presentation #219

Premises – Data Use involves: Working through a collaborative team approach; Engaging the team in a continuous improvement process; Relating the data to a specific problem/issue. Using data is an iterative process!

Outcomes – Local Data Use: An opportunity to learn about DAC's work with states in assisting local agencies in using data for program improvement. Orient participants to DAC's conceptual model of data analytics to identify root cause, using the VA-C experience and draft modules.

DAC's Goal: Form partnerships in states that join state and local agencies in the use of data to drive improved results.

Discussion Questions: 1. To what extent do local programs/agencies in your state use data effectively for improvement? 2. How does your state support and assist local programs/agencies in using data for improvement? 3. What is needed to ensure your local agencies/programs use data on an ongoing basis for continuous improvement?

Preparation Phase – Step 1: Identify Relevant Data (Module 1). Identify relevant data to define/refine the problem: Provide an overview of data quality standards; Demonstrate how to visually represent data; Review available data; Identify relevant data based on the defined/refined problem.

Inquiry Phase – Step 2: Conduct Data Analysis (Module 2). Conduct data analysis to generate a hypothesis: Review relevant data; Define the hypothesis; Analyze the data; Develop an analysis plan.

Inquiry Phase – Step 3: Test Hypothesis (Module 3). Test the hypothesis to determine root cause(s): Triangulate data; Discuss how to test the hypothesis; Determine root cause(s).

Action Phase – Step 4: Plan for Improvement (Module 4). Improvement planning: Discuss goal setting; Review the basic components of an improvement plan; Give examples of good vs. unacceptable components of an improvement plan; Develop the improvement plan.

Action Phase – Step 5: Evaluate Progress (Module 5). Evaluate progress: Discuss evaluation types, performance data, and measures; Differentiate between efforts and effects.

Overall Goal of the Modules: Increase your ability to effectively use data on an ongoing basis for system management, oversight, and improvement. This goal can be accomplished by increasing your ability to understand, review, analyze, and use data proactively and reactively.

Module 1: Overview – Why engage in this process; Types of data. Module 2: Preparation – Define the purpose and the issue; Identify the people and the process; Identify relevant data. Module 3: Inquiry – Analyze the data; Generate and test hypotheses; Determine actionable causes. Module 4: Action – Develop and implement improvement plans; Evaluate progress. Module 5: Practice Guide – Case study.

Who Can Benefit from These Modules? Local System Managers, Program Coordinators, Directors, Quality Assurance Representatives, Local Council Representatives

Steps in Preparation What is the issue? Who needs to be involved and how will they interact? What data are available and how will it be displayed? How long will it take?

Issue Description: The issue should be a clear, concise statement of the problem(s) that need to be addressed by the team.

Case Study: Let's Define the Issue. The local system is not meeting the state child find target of 2.6%. The local system is serving 1.46% of the 0-3 population, which is 56% of the state target. Define what you want to address, resolve, or improve. Reactive = Purpose.
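The arithmetic behind the 56% figure, as a minimal sketch (the two rates come from the slide above; everything else in the snippet is illustrative):

```python
# Percent of the state child find target that the local system is reaching.
# Rates are taken from the case study slide above.
served_rate = 1.46    # percent of the local 0-3 population currently served
state_target = 2.60   # state child find target (percent of 0-3 population)

percent_of_target = served_rate / state_target * 100
print(f"Serving {percent_of_target:.0f}% of the state target")  # ~56%
```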

When and for How Long Will the Team Meet? Factors that affect the timelines: internal deadlines, external deadlines, and other influences.

Case Study: Plan for Team Interaction. The team has decided to initially meet in person at the end of March to discuss the issue and review the initial data together as a team. During May and June, telephone conferences will be held every other Wednesday morning from 9 am to 10 am. The team will use exchanges to resolve issues that arise between meetings. In the beginning of July, the team will reconvene in person. The team has set a two-year timeline to achieve success.

Guide to Tackling the Data: Reduce the amount of data to the most relevant information. Relevant information = information that is related to the problem or issue of concern. Goal: Gather the evidence necessary to answer "why" the problem exists.

Collecting the Available Relevant Data: What data are available related to the issue? State? Local? Other data sources? How many years of trend data should be reviewed? What data should be included for comparison purposes? State? Other locality data? How can the data be disaggregated? By age of child? By service coordinator? By a time period? Do we have qualitative as well as quantitative data that relate to the issue?
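A minimal pandas sketch of the disaggregation questions above; the file name and column names (child_id, age_at_referral_months, service_coordinator, referral_date) are assumptions for illustration, not ITOTS fields:

```python
import pandas as pd

# Hypothetical referral extract; file and column names are assumed for illustration.
referrals = pd.read_csv("referrals.csv", parse_dates=["referral_date"])

# By age of child: bin age at referral into birth-to-1, 1-to-2, and 2-to-3 ranges.
age_bins = pd.cut(referrals["age_at_referral_months"], bins=[0, 12, 24, 36])
by_age = referrals.groupby(age_bins)["child_id"].count()

# By service coordinator.
by_coordinator = referrals.groupby("service_coordinator")["child_id"].count()

# By time period: referrals per quarter, useful when reviewing several years of trend data.
by_quarter = referrals.groupby(referrals["referral_date"].dt.to_period("Q"))["child_id"].count()

print(by_age, by_coordinator, by_quarter, sep="\n\n")
```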

For Easy Analysis, How Should the Data Be Displayed? Possible ways to display data include: column charts, bar charts, line charts, pie charts, and maps.

Column Chart: National and State Comparison of the Percent of Children Age 17 and Below without Health Insurance in 2008
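A short matplotlib sketch of the kind of column chart the caption above describes; the uninsured percentages below are placeholders, not the actual 2008 figures from the original chart:

```python
import matplotlib.pyplot as plt

# Placeholder values for illustration only; the original slide displayed the
# actual national and state figures for 2008.
groups = ["National", "State"]
uninsured_pct = [9.9, 7.5]  # percent of children age 17 and below, assumed

fig, ax = plt.subplots()
ax.bar(groups, uninsured_pct)
ax.set_ylabel("Percent without health insurance")
ax.set_title("Children age 17 and below without health insurance, 2008")
plt.show()
```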

Getting Ready for Inquiry: At this point the team has been assembled, the schedule has been set, and all relevant and available data have been gathered. Now prepare all the materials, including any visuals. Preparation is done. Let's begin the analysis!

The Arc of Virginia – Role of The Arc of Virginia: Advocacy; Training and Technical Assistance; Family Involvement. Affiliated with The Arc US; 25 local chapters.

Advocacy: Advocate for people with intellectual and related developmental disabilities and their families. Help people with disabilities, family members, friends, and others become advocates themselves.

Virginia Interagency Coordinating Council. December 9, 2009 VICC Meeting – APR information presented to stakeholder group; VICC focused on the results of Indicators 5 and 6. Indicator 5: Child Find 0-1 – percentage of children served decreased from the previous year. Indicator 6: Child Find 0-3 – percentage of children served increased from the previous year but did not meet the State target. The State target has not been met since initiation of the SPP/APR.

VICC. The VICC determined, in its role to advise and assist: Focus on Child Find; Requested data from the Part C Office; Investigated what activities other States with a "broad" definition of eligibility were implementing; Established a Data Committee.

Leadership Academy, April 2010 – Meeting for Local System Managers and Infant Program Directors. Addressed topics related to supervision and monitoring, leadership, and use of data. VICC members participated in data analysis sessions related to Child Find and obtained information about issues/efforts at the local level.

VICC. Child Find has been an agenda item at VICC meetings and the retreat since December. Continued efforts to advise and assist: Expansion of diagnosed conditions as an automatic qualifier for EI services (prematurity); Data review and analysis (referral sources, families declining services); Family involvement (collaboration with local systems).

The Arc of Virginia Outreach Coordinator – Activities in Development: Foster local parent involvement in all aspects of Part C; Data Analysis Team (family perspective, assistance with implementation of improvement plans, analysis of effectiveness of the improvement plan); Early Intervention Family Support Network; Collaboration with local Arc chapters.

Historical Perspective Virginia’s 2008 Determination Status Work with DAC

Historical Perspective Leadership Academy (April 2010) Data Analysis Modules

Pre On-Site Visit – With Local System Managers: Discuss purpose of the data analysis process; Discuss potential data team members; Identify ITOTS reports to be reviewed; Identify data from other sources that need to be reviewed; Pull three years' worth of data. Desk Audit: Review and analyze the same data as the local system; Formulate questions about the data; Identify additional data that may need to be collected.
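For the desk audit step, a sketch of how three years of child count data might be combined and compared against the state target; the file names, column names, and target value are assumptions for illustration, not ITOTS specifics:

```python
import pandas as pd

STATE_TARGET = 2.6  # state child find target, percent of 0-3 population (assumed here)

# Hypothetical annual extracts; file and column names are placeholders.
years = [2008, 2009, 2010]
frames = [pd.read_csv(f"child_count_{y}.csv").assign(year=y) for y in years]
trend = pd.concat(frames, ignore_index=True)

# Percent of the local 0-3 population served in each year, versus the target.
totals = trend.groupby("year")[["children_served", "population_0_3"]].sum()
totals["percent_served"] = totals["children_served"] / totals["population_0_3"] * 100
totals["percent_of_target"] = totals["percent_served"] / STATE_TARGET * 100
print(totals[["percent_served", "percent_of_target"]])
```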

First On-Site Visit PREPARATION Define the purpose and the issue Identify the people and the process Identify relevant Data

The Arc of Virginia: Collaborate with families to represent their interests, issues, perspectives, and concerns. Facilitate family involvement and leadership in the early intervention system. Review and report local, state, and national information related to family involvement and participation, parent-professional partnerships, and family perspectives (including information from diverse educational, cultural, geographic, and economic backgrounds). Assist with general monitoring and supervision activities.

Possible Team Members: System Manager, Quality Assurance Representative, Administrators, Parents/Families, Providers, Service Coordinators, Data Managers

Review Multiple Sources of Data (During Inquiry Phase): Data from interviews with families, service coordinators, EC community partners, etc.; Record reviews; Part C, local, and state data systems; Outside sources such as Juvenile Justice, Kids Count, and/or Health Departments.

Initial Visit Our initial visit consisted of the following team members: STATE: – State TA and Monitoring Consultants LOCAL: – Local System Manager – Program Coordinator; LSM Supervisor – 2 Service Coordinators – 1 compliance specialist (local)

Initial Data Review. The first visit was SHOCKING! Rather than move into "fixing" the low child count…we started to STUDY the data. The STUDY took two directions: data we could pull from our State database (ITOTS), national data, and local data; and anecdotal or "soft" data.

What Did the Initial Data Tell Us? We knew right away that the race/ethnicity of the children served matched the community. We also knew that the number of children born premature or with low birth weight was appropriate as well. We definitely needed more information about other issues….

We Need to Know More… How do you define a Substance Abuse Newborn Allegation count? Secondary to who these children are is how, as a state, we define "Substance Exposed / Effects of Toxic Exposure." Who are our referral sources…really? Parents, pediatricians, hospitals, and the Department of Human Services (Social Services) are the 4 main sources. Parents usually say their pediatrician told them to call. Are all pediatricians referring to us, or just some? Which hospitals – and who at the hospital? Are we catching all of the NICU nurseries?

We Need to Know More… Evaluated vs. not evaluated – there was a higher percentage of children not evaluated than we would like (43%). Spike in "unable to contact" when procedures changed to close a referral after 20 days of no contact. Hospitals and pediatricians accounted for most of these referrals (62%) – we hypothesized that many were premature newborns. Why were families declining evaluation? The majority were DHS referrals – are we explaining the differences between our program and theirs clearly? (possible hypothesis). Are those referrals inappropriate? Our DHS sends ALL children under 3, not just CAPTA referrals.
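One way to quantify the evaluated vs. not evaluated question is to cross-tabulate referral disposition by referral source. A minimal sketch, assuming a referral log with 'referral_source' and 'disposition' columns (names are illustrative, not the program's actual fields):

```python
import pandas as pd

# Hypothetical referral log; column names are assumptions for illustration.
referrals = pd.read_csv("referrals.csv")

# Flag referrals that never reached evaluation (e.g., unable to contact, declined).
not_evaluated = referrals["disposition"].isin(["Unable to Contact", "Declined Evaluation"])
print(f"Referrals not evaluated overall: {not_evaluated.mean():.0%}")

# Which sources account for most of the un-evaluated referrals?
by_source = pd.crosstab(referrals["referral_source"], not_evaluated, normalize="index")
print(by_source.sort_values(by=True, ascending=False))
```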

What about our Anecdotal Data… Is the children's hospital keeping all of their referrals in-house until the insurance runs out? Why do we account for only 1/3 of the referrals to Part B?

Additional Data Needed Who are our referral sources….REALLY? Why are so many families going to Part B without coming to Part C first? – Is this due to the military population? Why are so many families unable to contact? Why are families declining evaluation / eligibility determination? – What are the reasons?

What We Decided to DO: Modified the referral database to more accurately capture referral sources. Added secondary and tertiary referral source data input boxes. More information gathered.

What We Decided to DO: Worked with our Part B liaison, who tracked data on children who were under 3 and referred to Part B prior to coming to Part C. Gathered data related to referrals in the "unable to contact" and "declined evaluation" categories.

What We Decided to DO Utilized the TRACE materials and implemented a plan to visit our regular referral sources every quarter – In visiting our referral sources, we were able to engage in conversations about necessary contact information – Provided “logo” items, updated eligibility criteria and referral forms for each visit

Second On-Site Visit Inquiry Analyze the data Generate and Test Hypotheses Determine Actionable Causes

Generating Hypotheses: What actions in practice have contributed to local results in these areas? Infrastructure; Policies and Procedures; Daily Practices; Monitoring and Supervision; Resources.

Actionable Cause(s): To be sure we have identified an actionable cause: Would the problem have occurred if the cause had not been present? Will the problem recur if the cause is not corrected?

What We Found – Referral Sources: We are primarily seeing only 3 of the major pediatric practices in Norfolk refer to our program. There are 4 major pediatric groups we are not receiving referrals from: Boone Clinic, Norfolk Pediatrics, Tidewater Children's Associates, and Associates in Pediatric Care.

Questions We Ask: Are these the pediatricians who tell the parents to call? More parent data needed.

Learning from our Data

Date | Referral Source | Program/Unit | Referring Contact
20-Jan-11 | Department of Social Service (Non-CAPTA) | CPS | Yolanda Murrell
10-Jan-11 | Department of Social Service (Non-CAPTA) | Family Preservation Unit | Juanita Jones
10-Jan-11 | Department of Social Service (Non-CAPTA) | Family Preservation Unit | Juanita Jones
20-Jan-11 | Department of Social Service (Non-CAPTA) | CPS | Yolanda Murrell
19-Jan-11 | Department of Social Service (Non-CAPTA) | CPS | Keisha Williams
11-Jan-11 | Department of Social Service (Non-CAPTA) | CPS | Tasha Whitfield
11-Jan-11 | Homeless Shelter | Haven House | Rico Robinson
24-Jan-11 | Homeless Shelter | Dwelling Place | Cassandra Carter
05-Jan-11 | Hospital | NMCP | Jan Henderson
05-Jan-11 | Hospital | CHKD | Jan Odishoo
19-Jan-11 | Hospital | CHKD | Jan Odishoo
26-Jan-11 | Hospital | Sentara Norfolk General | Steve Brown
11-Jan-11 | Hospital | CHKD | Jan Odishoo
04-Jan-11 | Other CSB | Chesapeake ITC | Loren Wilee
26-Jan-11 | Parent/Guardian | |
12-Jan-11 | Parent/Guardian | |
10-Jan-11 | Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Palmer
27-Jan-11 | Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Palmer
24-Jan-11 | Pediatrician/Family Physician Group/Practice | Pediatric Specialists | Dr. Fink
25-Jan-11 | Pediatrician/Family Physician Group/Practice | Pediatric Associates | Dr. Cauley
14-Jan-11 | Pediatrician/Family Physician Group/Practice | Pediatric Associates | Dr. Shank
20-Jan-11 | Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Sriraman
19-Jan-11 | Pediatrician/Family Physician Group/Practice | Gen Peds | Dr. Brenner

Now we are getting better information about who is referring from each referral source. Still working on understanding how families are learning about us.

Learning from our Data

Referral Source | Practice | Physician | Outcome
Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Harrington | Enrolled to ITC
Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Harrington | Enrolled to ITC
Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Harrington | Enrolled to ITC
Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Harrington | Enrolled to ITC
Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Harrington | Unable to Contact
Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Palmer | Enrolled to ITC
Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Palmer | Enrolled to ITC
Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Shank | Unable to Contact
Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Sriraman | Enrolled to ITC
Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Sriraman | Declined Screening
Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Sriraman | Unable to Contact
Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Sriraman | Declined Screening
Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Sriraman | Declined Evaluation
Pediatrician/Family Physician Group/Practice | General Peds. | Dr. Sriraman | Unable to Contact

The newest doctor at this practice has only 1 in 7 referrals come into service. This pediatrician is a prime candidate to reach out to.
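The same per-physician pattern can be computed directly from a referral log. This sketch re-enters the rows shown in the table above and computes each physician's enrollment rate; it is a minimal illustration, not the program's actual tooling:

```python
import pandas as pd

# Rows transcribed from the referral log above (General Peds. practice).
log = pd.DataFrame({
    "physician": ["Dr. Harrington"] * 5 + ["Dr. Palmer"] * 2 +
                 ["Dr. Shank"] + ["Dr. Sriraman"] * 6,
    "outcome": ["Enrolled to ITC"] * 4 + ["Unable to Contact"] +
               ["Enrolled to ITC"] * 2 +
               ["Unable to Contact"] +
               ["Enrolled to ITC", "Declined Screening", "Unable to Contact",
                "Declined Screening", "Declined Evaluation", "Unable to Contact"],
})

# Share of each physician's referrals that enrolled in the program; low rates
# flag physicians who may need outreach about the referral process.
enrollment_rate = (log["outcome"].eq("Enrolled to ITC")
                   .groupby(log["physician"]).mean())
print(enrollment_rate.sort_values())
```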

What We Found RE: Part B Referrals. We found that there were 22 referrals of children under 3 during the year. 15/22 were from military families who were out of the area in the year prior. There were 8 pediatricians who referred to Part B rather than Part C; they will be targeted for child find activities.

What We Found: 34 "Unable to Contact" referrals, 1/1/11 – 4/30/11: 5 from the Department of Health (2 EHDI, 3 CHIP); 8 from the Department of Social Services (7 different workers, all CPS); 1 from a relative; 6 from hospitals (3 different referral sources; 4 referrals from 1 person); 14 from pediatricians (3 pediatric groups).

What We Found: 22 "Declined Evaluation" referrals, 1/1/11 – 4/30/11: 7 from the Department of Social Services (5 different workers, all CPS); 3 from homeless shelters (all different); 6 from parents (2 referred by a pediatrician); 6 from pediatricians (4 different doctors).

What We Found FY 09: 34% of referrals did not move into service FY 10: 43% of referrals did not move into service (this was the data we based our original search on) FY 11: 33% of referrals did not go into service – this is a nice decline as we were in the process of diving into this data

ITC Norfolk Hypotheses: Children are not referred at very young ages (average age at enrollment: 15 months). Some physicians are not providing families with a complete explanation of early intervention and the reason for referral. Families don't always distinguish between us and CPS. Some community pediatricians are not referring at all.

Final On-Site Visit Action Develop and Implement Improvement Plans Evaluate Progress

Final On-Site Visit Moving from Inquiry to Action Review Hypotheses Were we correct? Do we need to re-look at data to formulate new or additional hypotheses?

Final On-Site Visit. Evaluating Progress: Use data to determine if moving in the right direction. Improvement Planning: Consider priorities; Sphere of influence; Use data to determine if moving in the right direction.

ITC Norfolk Improvement Plan – Plan to increase the rate of referrals coming into service: Contact the referral source back when the contact information is incorrect. Notify physicians when the patients they refer decline evaluation. Continue to gather data from parents who refer their own children about how they heard about the program. Target the 4 major pediatric groups that are not referring to the program for intensive child find.

What's Next? Develop a mechanism to introduce the modules to local systems. Complete the data analysis modules. Analysis work completed with systems to date. Fine-tune the process work with local systems. Competing priorities. Develop mechanisms for keeping people engaged in this process.

Discussion Questions: 1. To what extent do local programs/agencies in your state use data effectively for improvement? 2. How does your state support and assist local programs/agencies in using data for improvement? 3. What is needed to ensure your local agencies/programs use data on an ongoing basis for continuous improvement?