Presentation transcript:

Building Local Capacity for Data Analysis and Use
Sharon Walsh, Data Accountability Center (DAC)
Mary Anne White, Infant & Toddler Connection of VA
Haidee Bernstein, Data Accountability Center (DAC)
Beverly Crouse, Infant & Toddler Connection of VA

Data Accountability Center Goal
Form partnerships in states that join state and local agencies in the use of data to drive improved results.

Premises: Data Use Involves
Working through a collaborative team approach.
Engaging the team in a continuous improvement process.
Relating the data to a specific problem/issue.
Using data is an iterative process!

There is a Process for Using Data to Improve Performance!

Important Points for Helping Local Data Teams Be Successful
How do you maximize data you already collect and collect what you need?
How do you organize your staff and your agency around ongoing data use?
It's all about continuous improvement.
Use data to determine the priority for focus.
It is important to “drill down” into performance to identify meaningful solutions.

DATA ACCOUNTABILITY CENTER DATA ANALYTICS
Inquiry:
1. Identify relevant data
2. Conduct data analysis to generate hypotheses
3. Test hypotheses to determine root cause
Action:
4. Plan for improvement
5. Evaluate progress

Historical Perspective
Virginia's 2008 Determination Status
Work with DAC
Leadership Academy (April 2010)
Data Analysis Modules

Historical Perspective: Leadership Academy, April 2010
Two sessions held: plenary and breakout
 Plenary: overview of the use and quality of data
 Breakout sessions: use of actual local data
Results:
Positive feedback from meeting evaluations
Participants wanted more time to spend on the activity
The first activity in all CAPs or SEPs developed now requires that a data analysis be completed

Standards and Principles
Use data on a regular basis
Use data for continuous improvement
Verify the accuracy of your data
Make sure you have the right team at each step
Own your data
Use a process to determine how much data is needed

From Notes to PowerPoint
From June through September 2010, a very detailed outline for the Data Analysis Modules was developed.
In December 2010, work on the PowerPoint presentation began.
In February 2011, the PowerPoint slides were put onto SharePoint.
Today, we are officially showing some of the slides and explaining the process and rationale.

From PowerPoint to Lectora

Data Analysis Modules
Module 1: Overview. Why engage in this process; types of data.
Module 2: Preparation. Identify people to look at the data; define the problem(s).
Module 3: Inquiry. Identify relevant data; conduct data analyses to generate hypotheses; consider and test hypotheses.
Module 4: Action. Determine actionable causes; develop and implement improvement plans; evaluate progress.
Module 5: Practice Guide. Case study.

Local Lead Agencies: Infant & Toddler Connection of...
1. Alexandria
2. the Alleghany Highlands
3. Arlington
4. the Roanoke Valley
5. Central Virginia
6. Chesapeake
7. Chesterfield
8. Williamsburg * James City * York * Poquoson
9. Planning District
10. Cumberland Mountain
11. Danville-Pittsylvania
12. Dickenson
13. Crater District
14. the Eastern Shore
15. Fairfax-Falls Church
16. Goochland-Powhatan
17. Hampton-Newport News
18. Hanover
19. Harrisonburg-Rockingham
20. Henrico-Charles City-New Kent
21. the Highlands
22. Loudoun
23. Middle Peninsula-N. Neck
24. Mount Rogers
25. the New River Valley
26. Norfolk
27. Shenandoah Valley
28. the Piedmont
29. LENOWISCO
30. Portsmouth
31. Prince William, Manassas and Manassas Park
32. Rappahannock-Rapidan
33. Rappahannock Area
34. the Blue Ridge
35. Richmond
36. the Rockbridge Area
37. Southside
38. Valley
39. Virginia Beach
40. Western Tidewater

Ways to Use Data
Identifying issues
Monitoring
System planning
Improvement activities
System oversight/management

Approach to Improvement Planning
Monitoring consultants
TA consultants
Local system
Planned preparation

Possible Reactions
Negative reactions (potential roadblocks):
– I do not have time for this
– I already know this
– I know the problems
– I have the solutions
Positive reactions (potential facilitators):
– In the long run this will save time
– I didn't know this was possible
– This information will help me do my job better
– This information will help families

Proactive Versus Reactive: Both Are Positive
Proactive:
Check the data to ensure its accuracy
Determine program effectiveness
Develop a plan using local data
Reactive:
Use available data to respond to a problem
Adjust plans that are in place after conducting data analysis

What Is Your Purpose?
Reactive example: responding to an issue such as monitoring results. Purpose: to address monitoring results that are below the state target.
Proactive example: conduct a quality review or assessment to determine areas of need. Purpose: to proactively look at the quality of the data. Good idea!

How Will Your Team Interact?
Who can decipher the data?
Who can bring new ideas?
Who can make the decisions?
Who can translate data into policy?

Pre On-Site Visit
With local system managers:
Discuss the purpose of the data analysis process
Discuss potential data team members
Identify ITOTS reports to be reviewed
Identify data from other sources that need to be reviewed
Pull three years' worth of data
Desk audit (see the sketch below):
Review and analyze the same data as the local system
Formulate questions about the data
Identify additional data that may need to be collected
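A desk audit like the one above might start by stacking the same report across the three pulled years so that year-to-year shifts stand out. A minimal sketch, assuming hypothetical CSV extracts and column names (referral_outcomes_<year>.csv, was_evaluated); the actual ITOTS reports are not necessarily structured this way:

```python
# Sketch: stacking three years of a report extract for a desk audit.
# File names and columns are illustrative assumptions, not ITOTS fields.
import pandas as pd

frames = []
for year in [2008, 2009, 2010]:
    df = pd.read_csv(f"referral_outcomes_{year}.csv")
    df["fiscal_year"] = year
    frames.append(df)

history = pd.concat(frames, ignore_index=True)

# One desk-audit question: how has the share of referrals that
# were evaluated changed over the three years?
trend = history.groupby("fiscal_year")["was_evaluated"].mean() * 100
print(trend.round(1))
```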

First On-Site Visit
Preparation:
1. Define and articulate the problem
2. Define the problem/issue
Inquiry:
3. Identify relevant data
4. Conduct data analysis to generate hypotheses
5. Test hypotheses to determine actionable causes

Beginning the Journey
1. Complete the Preparation phase and part of the Inquiry phase
2. Review the data reports: “What does the data tell you?”
 What are the good things the data is telling you?
 What surprises you about the data?
 What questions strike you as you look at the data?
 What data appears to be missing?

Review Multiple Sources of Data (During the Inquiry Phase)
Data from interviews with families, service coordinators, EC community partners, etc.
Record reviews
Part C, local, and state data systems
Outside sources such as Juvenile Justice, Kids Count, and/or health departments

Infant & Toddler Connection of Playground City: Referral Outcome by Referral Source, 7/1/09 – 7/30/10
[Table from the slide. Rows: referral sources (Health, DSS, Doctor's, Parent, Other, Totals). Columns: Evaluated (Evaluated Ineligible, Will Receive Services, Total), Not Evaluated (Unable to Contact, Declined Screening, Declined Evaluation, Total), and Total Referrals. Cell values were not preserved in the transcript; the next slide summarizes them.]

Infant and Toddler Connection of Playground City: Referral Outcome by Referral Source (7/01/09 – 7/30/10)
1. Information the local system gathered through this report: 53% of all referrals are evaluated; 47% are not evaluated.
– 46% of all referrals will receive services
– 6% of all referrals were evaluated ineligible
– 11% of all referrals were lost to contact
– 9% of all referrals declined screening
– 26% of all referrals declined an evaluation
2. Physician referrals: 26% of all referrals
3. Parent referrals: 19% of all referrals
4. Health: 15% of all referrals
5. Dept. of Social Services: 14% of all referrals
A. Physician referrals: 39% were evaluated; 61% were not evaluated. Of those not evaluated, 39% declined either screening or evaluation.
B. Family referrals: 65% were evaluated; 35% were not evaluated. Of those not evaluated, 100% declined screening or evaluation.
C. Health referrals: 56% were evaluated; 44% were not evaluated. Of those not evaluated, 43% were lost to contact and 57% declined screening or evaluation.
D. DSS referrals: 40% were evaluated; 60% were not evaluated. Of those not evaluated, 22% were lost to contact and 78% declined screening or evaluation.
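Percentages like these fall out of a simple cross-tabulation of the referral extract. A minimal sketch, assuming hypothetical column names and category values (referral_source, outcome); the actual ITOTS schema may differ:

```python
# Sketch: referral-outcome percentages by source, as in the summary above.
# Column names and category values are illustrative assumptions.
import pandas as pd

referrals = pd.read_csv("playground_city_referrals.csv")  # hypothetical extract

# Share of all referrals from each source (physician, parent, health, ...).
print((referrals["referral_source"].value_counts(normalize=True) * 100).round(0))

# Within each source: evaluated vs. not evaluated.
evaluated = referrals["outcome"].isin(["will_receive_services", "evaluated_ineligible"])
print((evaluated.groupby(referrals["referral_source"]).mean() * 100).round(0))

# Among the not-evaluated referrals, the reason (lost to contact,
# declined screening, declined evaluation), broken out by source.
reasons = (referrals.loc[~evaluated]
           .groupby("referral_source")["outcome"]
           .value_counts(normalize=True) * 100)
print(reasons.round(0))
```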

Additional Data Needed
What is the average age of referrals?
Which physicians are referring?
 Specific name versus name of practice
 What is the average age of the physician referral?
How do families hear about Part C services?
Why are families declining Part C services?
 At what point in the process are families declining Part C services?

Second On-Site Visit
Inquiry:
3. Identify relevant data
4. Conduct data analysis to generate hypotheses
5. Test hypotheses to determine actionable causes

What's Accomplished?
Summarize discussion from the previous visit
Review data collected in between visits
Formulate hypotheses: a hypothesis is a proposition or supposition tentatively accepted to explain certain facts or to provide a basis for further investigation
Identify strategies to test the hypotheses

Data Collection (8/1/10 – 11/30/10)
Average age of referral: 16 months
10 referrals received from physicians:
– Average age of referral: 14 months
– Dr. Swingset: 0 referrals received < 18 months
– Dr. Sandbox: 0 referrals received < 24 months
– Dr. Bottle: average age of referral is 9 months; 50% declined a screening
– No referrals from the NICU at the ABC hospital

Data Collection (8/1/10 – 11/30/10), continued
12 referrals received from families:
– 7 families declined services:
57% of families felt their child was developing at age level
43% of families wanted to receive services through a private agency
5 families declined a developmental screening
2 families declined Assessment for Service Planning
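Numbers like the ones on these two slides could come from a short script over the referral log collected between visits. A minimal sketch, with hypothetical field names (referral_date, birth_date, referring_physician, declined_services, decline_reason) standing in for whatever the local log actually records:

```python
# Sketch: average age at referral per physician, and decline reasons
# among family referrals. Field names are illustrative assumptions.
import pandas as pd

refs = pd.read_csv("referral_log_aug_nov_2010.csv",
                   parse_dates=["referral_date", "birth_date"])

# Age at referral, in months (approximate month length of 30.44 days).
refs["age_months"] = (refs["referral_date"] - refs["birth_date"]).dt.days / 30.44
print("Average age at referral: %.1f months" % refs["age_months"].mean())

# Which physicians refer, how often, and at what average age?
docs = refs[refs["referral_source"] == "physician"]
print(docs.groupby("referring_physician")["age_months"]
          .agg(["count", "mean"]).round(1))

# Among family referrals that declined services, why?
declines = refs[(refs["referral_source"] == "family") & refs["declined_services"]]
print((declines["decline_reason"].value_counts(normalize=True) * 100).round(0))
```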

ITC Playground City Hypotheses
 Physicians are not referring children at very young ages
 Physicians are not providing families with a complete explanation of early intervention and the reason for referral
 Hospitals are not referring premature babies

Final On-Site Visit
Inquiry:
3. Identify relevant data
4. Conduct data analysis to generate hypotheses
5. Test hypotheses to determine actionable causes
Action:
6. Develop and implement improvement plan
7. Evaluate progress

Final On-Site Visit: Moving from Inquiry to Action
Review hypotheses:
Were we correct?
Do we need to re-look at the data to formulate new or additional hypotheses?
Improvement planning and evaluating progress:
Consider priorities
Consider your sphere of influence
Use data to determine if you are moving in the right direction

ITC Playground City Improvement Plan
Plan to increase referrals of premature babies from the NICU:
 Identify discharge social workers, nurses, or therapists responsible for referrals
 Meet with those individuals
 Gather data from the hospital (# of premature births residing in their community; where referrals are being made)
 Provide information about EI in Virginia
 Collaboratively develop a mechanism to meet with the family prior to NICU discharge
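Evaluating progress on a plan like this can be as simple as trending NICU-source referrals over time. A minimal sketch with hypothetical columns (referral_source, referral_date), not the actual ITOTS fields:

```python
# Sketch: is the NICU referral plan moving in the right direction?
# Counts NICU-source referrals per quarter; column names are
# illustrative assumptions.
import pandas as pd

refs = pd.read_csv("referral_log.csv", parse_dates=["referral_date"])
nicu = refs[refs["referral_source"] == "NICU"]

# Referrals per quarter; a rising count after the plan begins
# suggests the outreach to discharge staff is working.
per_quarter = nicu.resample("QS", on="referral_date").size()
print(per_quarter)
```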

What's Next?
Develop a mechanism to introduce the process to local systems
Complete the data analysis modules
Fine-tune the process based on analysis work completed with local systems to date
Manage competing priorities
Develop mechanisms to keep people engaged in this process

Things to Remember
States can assist local agencies/programs to remember:
It is all about improved quality of services for children and families
It is hard to let go of traditional improvement planning
It is hard to let go of your own sense of what the problem/solution is
Follow the data where it leads you
Ask the difficult questions
Create an environment where solutions are generated