
1 Building Local Capacity for Data Analysis and Use Sharon Walsh, Data Accountability Center (DAC) Mary Anne White, Infant & Toddler Connection of VA Haidee Bernstein, Data Accountability Center (DAC) Beverly Crouse, Infant & Toddler Connection of VA

2 Data Accountability Center Goal: Form partnerships in states that join state and local agencies in the use of data to drive improved results

3 Premises: Data use involves working through a collaborative team approach; engaging the team in a continuous improvement process; and relating the data to a specific problem/issue. Using data is an iterative process!

4 There is a Process for Using Data to Improve Performance!

5 Important Points for Helping Local Data Teams Be Successful How do you maximize data you already collect and collect what you need? How do you organize your staff and your agency around ongoing data use? It’s all about continuous improvement. Use data to determine the priority for focus. It is important to “drill down” to understand performance and identify meaningful solutions.

6 The DAC data analytics process: 1. Identify relevant data. 2. Conduct data analysis to generate hypotheses. 3. Test hypotheses to determine root causes. 4. Plan for improvement. 5. Evaluate progress. (The cycle moves from Inquiry to Action.)

7 Historical Perspective Virginia’s 2008 determination status. Work with DAC. Leadership Academy (April 2010). Data Analysis Modules.

8 Leadership Academy, April 2010 Two sessions held: plenary & breakout.  Plenary: overview of the use of quality data.  Breakout sessions: use of actual local data. Results: positive feedback from meeting evaluations; participants wanted more time to spend on the activity. The first activity in all CAPs or SEPs developed now requires that a data analysis be completed.

9 Standards and Principles: Use data on a regular basis. Use data for continuous improvement. Verify the accuracy of your data. Make sure you have the right team at each step. Own your data. Use a process to determine how much data is needed.

10 From Notes to PowerPoint From June through September 2010, a very detailed outline for the Data Analysis Modules was developed. In December 2010, work on the PowerPoint presentation began. In February 2011, the PowerPoint slides were put onto SharePoint. Today, we are officially showing some of the slides and explaining the process and rationale.

11 From PowerPoint to Lectora

12 The five modules: Module 1 (Overview): Why engage in this process; types of data. Module 2 (Preparation): Identify people to look at the data; define the problem(s). Module 3 (Inquiry): Identify relevant data; conduct data analyses to generate hypotheses; consider and test hypotheses. Module 4 (Action): Determine actionable causes; develop and implement improvement plans; evaluate progress. Module 5 (Practice Guide): Case study.

13 Local Lead Agencies: the 40 local systems, each named “Infant & Toddler Connection of …”:
1. Alexandria
2. the Alleghany Highlands
3. Arlington
4. the Roanoke Valley
5. Central Virginia
6. Chesapeake
7. Chesterfield
8. Williamsburg/James City/York/Poquoson
9. Planning District 14
10. Cumberland Mountain
11. Danville-Pittsylvania
12. Dickenson
13. Crater District
14. the Eastern Shore
15. Fairfax-Falls Church
16. Goochland-Powhatan
17. Hampton-Newport News
18. Hanover
19. Harrisonburg-Rockingham
20. Henrico-Charles City-New Kent
21. the Highlands
22. Loudoun
23. Middle Peninsula-Northern Neck
24. Mount Rogers
25. the New River Valley
26. Norfolk
27. Shenandoah Valley
28. the Piedmont
29. LENOWISCO
30. Portsmouth
31. Prince William, Manassas and Manassas Park
32. Rappahannock-Rapidan
33. Rappahannock Area
34. the Blue Ridge
35. Richmond
36. the Rockbridge Area
37. Southside
38. Valley
39. Virginia Beach
40. Western Tidewater

14 Ways to Use Data: identifying issues; monitoring; system planning; improvement activities; system oversight/management.

15 Approach to Improvement Planning: monitoring consultants; TA consultants; the local system; planned preparation.

16 Possible Reactions
Negative Reactions / Potential Roadblocks:
– I do not have time for this
– I already know this
– I know the problems
– I have the solutions
Positive Reactions / Potential Facilitators:
– In the long run this will save time
– I didn’t know this was possible
– This information will help me do my job better
– This information will help families

17 Proactive Versus Reactive: Both Are Positive Proactive: Check the data to ensure its accuracy. Determine program effectiveness. Develop a plan using local data. Reactive: Use available data to respond to a problem. Adjust plans that are in place after conducting data analysis.

18 What Is Your Purpose? Reactive example: responding to an issue such as monitoring results. Purpose: to address monitoring results that are below the state target. Proactive example: conducting a quality review or assessment to determine areas of need. Purpose: to proactively look at the quality of data, which is a good idea.

19 How Will Your Team Interact? Who can decipher the data? Who can bring new ideas? Who can make the decisions? Who can translate data into policy?


21 Pre On-Site Visit With local system managers: discuss the purpose of the data analysis process; discuss potential data team members; identify ITOTS reports to be reviewed; identify data from other sources that need to be reviewed; pull three years’ worth of data. Desk audit: review and analyze the same data as the local system; formulate questions about the data; identify additional data that may need to be collected.
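
To make the desk-audit step concrete, here is a minimal pandas sketch of one way to review three years of referral data and formulate questions before the visit. The file name, column layout, and 25% threshold are invented for illustration and are not part of ITOTS:

```python
# Hypothetical desk-audit sketch: compare three years of local referral
# counts and flag large year-over-year swings to ask about on-site.
# The CSV name and columns (year, referral_source, count) are invented;
# real ITOTS exports will look different.
import numpy as np
import pandas as pd

df = pd.read_csv("referrals_3yr.csv")  # columns: year, referral_source, count
by_year = df.pivot_table(index="referral_source", columns="year",
                         values="count", aggfunc="sum").fillna(0)

# Percent change from the earliest to the latest of the three years.
first, last = by_year.columns.min(), by_year.columns.max()
change = (by_year[last] - by_year[first]) / by_year[first].replace(0, np.nan)

# Flag sources that moved more than 25% either way (arbitrary threshold).
flagged = change[change.abs() > 0.25]
print("Referral sources to ask about:")
print(flagged.sort_values())
```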

22 First On-Site Visit Preparation: 1. Define and articulate the problem. 2. Define the problem/issue. Inquiry: 3. Identify relevant data. 4. Conduct data analysis to generate hypotheses. 5. Test hypotheses to determine actionable causes.

23 Beginning the Journey 1. Complete the Preparation phase and part of the Inquiry phase. 2. Review the data reports: “What does the data tell you?”  What are the good things the data is telling you?  What surprises you about the data?  What questions strike you as you look at the data?  What data appears to be missing?

24 Review Multiple Sources of Data (During the Inquiry Phase) Data from interviews with families, service coordinators, EC community partners, etc. Record reviews. Part C, local, and state data systems. Outside sources such as Juvenile Justice, Kids Count, and/or health departments.

25 Infant & Toddler Connection of Playground City: Referral Outcome by Referral Source, 7/1/09 – 7/30/10

| Referral Source | Eval: Ineligible | Eval: Will Receive Services | Eval: Total | Not Eval: Unable to Contact | Not Eval: Declined Screening | Not Eval: Declined Eval | Not Eval: Total | Total |
|---|---|---|---|---|---|---|---|---|
| Health | 0 | 9 | 9 | 3 | 1 | 3 | 7 | 16 |
| DSS | 1 | 5 | 6 | 2 | 2 | 5 | 9 | 15 |
| Doctor's | 1 | 10 | 11 | 4 | 5 | 8 | 17 | 28 |
| Parent | 3 | 10 | 13 | 0 | 1 | 6 | 7 | 20 |
| Other | 2 | 15 | 17 | 3 | 1 | 6 | 10 | 27 |
| Totals | 7 | 49 | 56 | 12 | 10 | 28 | 50 | 106 |
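
The next slide reads this table as percentages. As a sketch of that arithmetic (the counts below are copied straight from the table; only the code itself is illustrative):

```python
# Counts from the Playground City table, keyed by referral source:
# (eval_ineligible, will_receive_services, unable_to_contact,
#  declined_screening, declined_eval)
counts = {
    "Health":   (0,  9, 3, 1, 3),
    "DSS":      (1,  5, 2, 2, 5),
    "Doctor's": (1, 10, 4, 5, 8),
    "Parent":   (3, 10, 0, 1, 6),
    "Other":    (2, 15, 3, 1, 6),
}
total = sum(sum(row) for row in counts.values())             # 106 referrals
evaluated = sum(row[0] + row[1] for row in counts.values())  # 56 evaluated
print(f"Evaluated: {evaluated / total:.0%}")                 # 53%
print(f"Not evaluated: {(total - evaluated) / total:.0%}")   # 47%
for source, row in counts.items():
    share = sum(row) / total
    print(f"{source}: {share:.0%} of all referrals")  # e.g. Doctor's: 26%
```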

26 Infant and Toddler Connection of Playground City: Referral Outcome by Referral Source (7/01/09–7/30/10)
1. Information the local system gathered through this report: 53% of all referrals are evaluated; 47% are not evaluated.
– 46% of all referrals will receive services
– 6% of all referrals were evaluated ineligible
– 11% of all referrals were lost to contact
– 9% of all referrals declined screening
– 26% of all referrals declined an evaluation
2. Physician referrals: 26% of all referrals. 3. Parent referrals: 19% of all referrals. 4. Health: 15% of all referrals. 5. Dept. of Social Services: 14% of all referrals.
A. Physician referrals: 39% were evaluated; 61% were not evaluated. Of those not evaluated, 76% declined either screening or evaluation.
B. Family referrals: 65% were evaluated; 35% were not evaluated. Of those not evaluated, 100% declined screening or evaluation.
C. Health referrals: 56% were evaluated; 44% were not evaluated. Of those not evaluated, 43% were lost to contact and 57% declined screening or evaluation.
D. DSS referrals: 40% were evaluated; 60% were not evaluated. Of those not evaluated, 22% were lost to contact and 78% declined screening or evaluation.
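
Items A–D are the drill-down within the not-evaluated group; a short sketch of that calculation using the same slide-25 counts:

```python
# Among referrals NOT evaluated, split "lost to contact" from
# "declined screening or evaluation". Counts per source are
# (unable_to_contact, declined_screening, declined_eval) from slide 25.
not_evaluated = {
    "Doctor's": (4, 5, 8),
    "Parent":   (0, 1, 6),
    "Health":   (3, 1, 3),
    "DSS":      (2, 2, 5),
}
for source, (lost, screen, eval_) in not_evaluated.items():
    n = lost + screen + eval_
    print(f"{source}: {lost / n:.0%} lost to contact, "
          f"{(screen + eval_) / n:.0%} declined screening or evaluation")
```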

27 Additional Data Needed What is the average age of referrals? Which physicians are referring?  Specific name versus name of practice  What is the average age of the physician referral? How do families hear about Part C services? Why are families declining Part C services?  At what point in the process are families declining Part C services?

28 Second On-Site Visit Inquiry: 3. Identify relevant data. 4. Conduct data analysis to generate hypotheses. 5. Test hypotheses to determine actionable causes.

29 What’s Accomplished? Summarize discussion from the previous visit. Review data collected in between visits. Formulate hypotheses (a hypothesis is a proposition or supposition tentatively accepted to explain certain facts or to provide a basis for further investigation). Identify strategies to test hypotheses.

30 Data Collection (8/1/10 – 11/30/10) Average age of referral: 16 months. 10 referrals received from physicians: – Average age of referral: 14 months – Dr. Swingset: 0 referrals received for children < 18 months – Dr. Sandbox: 0 referrals received for children < 24 months – Dr. Bottle: average age of referral is 9 months; 50% of these referrals declined a screening. No referrals from the NICU at the ABC hospital.
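
Producing figures like these only requires a referral date, a birth date, and a referral source per record. A hedged sketch (the record layout and the sample dates are invented; the physician names follow the slide’s placeholders):

```python
# Hypothetical sketch: average age at referral, overall and per physician.
# The sample records are invented for illustration.
from datetime import date

# (child_birth_date, referral_date, referring_physician)
records = [
    (date(2009, 5, 1),  date(2010, 8, 15), "Dr. Bottle"),
    (date(2009, 1, 20), date(2010, 9, 3),  "Dr. Bottle"),
    (date(2008, 11, 2), date(2010, 10, 9), "Dr. Swingset"),
]

def age_in_months(birth: date, referral: date) -> int:
    """Whole months between birth and referral (day-of-month ignored)."""
    return (referral.year - birth.year) * 12 + (referral.month - birth.month)

ages = [age_in_months(b, r) for b, r, _ in records]
print(f"Average age of referral: {sum(ages) / len(ages):.0f} months")

for doc in sorted({src for _, _, src in records}):
    doc_ages = [age_in_months(b, r) for b, r, s in records if s == doc]
    print(f"{doc}: {sum(doc_ages) / len(doc_ages):.0f} months "
          f"({len(doc_ages)} referrals)")
```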

31 Data Collection (8/1/10 – 11/30/10) 12 referrals received from families: – 7 families declined services: 57% of families felt their child was developing at age level; 43% of families wanted to receive services through a private agency. – 5 families declined a developmental screening; 2 families declined Assessment for Service Planning.
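
As a quick check, the 57%/43% split above corresponds to 4 of the 7 declining families versus 3 of 7 (the 4-vs-3 counts are inferred from the stated percentages, not given on the slide):

```python
# 57%/43% of the 7 declining families; the 4-vs-3 split is inferred
# from the stated percentages rather than given on the slide.
declined = 7
print(f"{4 / declined:.0%} felt their child was on track")    # 57%
print(f"{3 / declined:.0%} wanted a private agency instead")  # 43%
```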

32 ITC Playground City Hypotheses  Physicians are not referring children at very young ages  Physicians are not providing families with a complete explanation of early intervention and reason for referral  Hospitals are not referring premature babies

33 Final On-Site Visit Inquiry: 3. Identify relevant data. 4. Conduct data analysis to generate hypotheses. 5. Test hypotheses to determine actionable causes. Action: 6. Develop and implement improvement plan. 7. Evaluate progress.

34 Final On-Site Visit Moving from inquiry to action. Review hypotheses: Were we correct? Do we need to re-examine the data to formulate new or additional hypotheses? Improvement planning and evaluating progress: consider priorities and your sphere of influence; use data to determine whether you are moving in the right direction.

35 ITC Playground City Improvement Plan Plan to increase referrals of premature babies from the NICU:  Identify the discharge social workers, nurses, or therapists responsible for referrals  Meet with those individuals  Gather data from the hospital (number of premature births in their community; where referrals are being made)  Provide information about EI in Virginia  Collaboratively develop a mechanism to meet with families prior to NICU discharge

36 What’s Next? Develop a mechanism to introduce the process to local systems. Complete the data analysis modules. Build on the analysis work completed with systems to date. Fine-tune the process work with local systems amid competing priorities. Develop mechanisms to keep people engaged in this process.

37 Things to Remember States can help local agencies/programs remember: It is all about improved quality of services for children and families. It is hard to let go of traditional improvement planning, and hard to let go of your own sense of what the problem or solution is. Follow the data where it leads you. Ask the difficult questions. Create an environment where solutions are generated.

