Linking Early Intervention Quality Practices With Child and Family Outcomes
Sherry Franklin, North Carolina Part C Coordinator
Deborah Carroll, PhD, Branch Head
May 15, 2012

NC Part C Context
- State Lead Agency
- 17 Local Lead Agencies
- 13 State Employees
- 4 Contract Agencies
- Community Early Intervention Service Providers

Results Component
- Continuous Improvement Visit: November 14-18, 2011
- Verification
- Results Component
- May-June: Results Focus Selection
  - Family Outcomes
  - Birth to 1 Child Find

Orienting Programs/Providers to Key Practices that Support Child and Family Outcomes
- State Interagency Coordinating Council formed a task group to assist in Results Component selection
- Reviewed Family Outcomes Survey and process
- So What?
- Reviewed "Relationship of Quality Practices to Child and Family Outcome Measurement Results"

Orienting Programs/Providers to Key Practices that Support Child and Family Outcomes
Families and stakeholders were able to:
- Share their experiences with expected practices
- Define common strengths and challenges
- Come to consensus on the importance of the skills/training of Early Intervention Service Coordinators

Conducting Self-Assessment of State or Local Performance on Practices
Focused Monitoring: Procedural Safeguards
- Use of the tool to train monitors on the practices that have a direct impact on "Know Your Rights"
- Embed questions related to practices in interview tools

Now What?
- How to get started
- How to implement a statewide change in practice
- How to achieve the desired outcome

Data Accountability Center (DAC)

Bottom Line
- State-Local Partnership
- State Implementation Team
- Local Implementation Team
- Quality Data
- Specific Problem/Issue
- Data-Based Decisions

Our Partnership: Concord, Durham, Greensboro, Morganton, Rocky Mount, Shelby

Implementation Team
- Represents community members and systems stakeholders
- Advises and assists systems change
- Develops and implements clear plans with assignments of tasks/timelines
- Keeps the implementation process focused
- Solves problems that arise during the process

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, National Implementation Research Network.

Local Implementation Team
Must involve a team:
- Program Director or designee*
- Data person/Quality Assurance staff
- Providers/staff involved in the issue topic
- Parent
- Others as needed
* Must be a person with influence/authority

State Implementation Team
Must involve a team:
- Administrators*
- Data person
- ICC member
- Parent
- Others as needed
* Must be a person with influence/authority

DAC Framework for Data Use
Consists of three phases with several steps:

Preparation Phase
1. Identify relevant data

Inquiry Phase
1. Conduct data analysis
2. Determine root cause

Action Phase
1. Plan for improvement
2. Implement plan
3. Evaluate progress
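Purely as an illustration (not part of the DAC materials), the phases and steps above can be encoded as an ordered checklist, so a team can always see its next step in the framework:

```python
# The three DAC phases and their steps, as listed on the slide.
# Python dicts preserve insertion order (3.7+), so iteration
# walks the phases in the order the framework prescribes.
DAC_FRAMEWORK = {
    "Preparation": ["Identify relevant data"],
    "Inquiry": ["Conduct data analysis", "Determine root cause"],
    "Action": ["Plan for improvement", "Implement plan", "Evaluate progress"],
}

def next_step(completed):
    """Return (phase, step) for the first step not yet completed,
    or None once every step in every phase is done."""
    for phase, steps in DAC_FRAMEWORK.items():
        for step in steps:
            if step not in completed:
                return phase, step
    return None
```

For example, a team that has identified its data and finished its analysis would be pointed at "Determine root cause" in the Inquiry phase.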

Proactive Versus Reactive (Both are Positive)

Proactive:
- A process to review existing data to select priorities for program improvement
- A process to determine program compliance and effectiveness

Reactive:
- A process used to respond to a state-identified problem
- A process used to respond to a locally identified problem

Data Quality Standards
How do you know whether the data collected/used are of high quality?

Data Quality Standards
Data collected, submitted, analyzed, and reported must be:
- Timely
- Accurate
  - Reliable
  - Consistent
  - Objective
  - Valid
  - Complete
  - Credible
- Secure
- Useful
  - Interpretable
  - Relevant
  - Transparent
  - Accessible
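As an illustration only (not part of the NC data system), a minimal sketch of how a few of these standards — completeness, validity, and timeliness — might be checked programmatically. The field names and the required-field list are hypothetical:

```python
from datetime import date

# Hypothetical record fields; real Part C data systems differ.
REQUIRED_FIELDS = ("child_id", "referral_date", "outcome_rating")
VALID_RATINGS = range(1, 8)  # Child Outcomes Summary ratings run 1-7

def quality_issues(record, reporting_deadline):
    """Return a list of data-quality problems found in one record."""
    issues = []
    # Complete: every required field must be present and non-empty
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            issues.append(f"missing {field}")
    # Valid: ratings must fall within the allowed range
    rating = record.get("outcome_rating")
    if rating is not None and rating not in VALID_RATINGS:
        issues.append(f"invalid rating {rating}")
    # Timely: data must be entered by the reporting deadline
    entered = record.get("entry_date")
    if entered and entered > reporting_deadline:
        issues.append("entered after deadline")
    return issues
```

Checks like these, run when data are submitted rather than at reporting time, are one way a state or local team can keep "accurate" and "timely" from being after-the-fact discoveries.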

Initial Questions (Using Available Data)
1. What do you notice? What are the patterns? Trends?
2. What kinds of questions do you have as you look at the available data?
3. What other data might you want to explore to dig into these questions?
4. What is the storyline?
5. What are these data telling you?
6. What do you want to know?
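To make the "patterns and trends" question concrete, a hypothetical sketch (the counts are invented, not NC data) of computing year-over-year change in a birth-to-one child find indicator:

```python
# Invented annual counts of infants referred before age one.
referrals_by_year = {2008: 410, 2009: 455, 2010: 430, 2011: 520}

def year_over_year_change(series):
    """Percent change between consecutive years, for spotting trends."""
    years = sorted(series)
    return {
        year: round(100 * (series[year] - series[prev]) / series[prev], 1)
        for prev, year in zip(years, years[1:])
    }
```

A dip like 2010's in this invented series is exactly the kind of pattern that should prompt the follow-up questions above: is it real, is it a data-quality artifact, and what other data would tell us?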

Drill-down involves accessing information by starting with a general category and moving through the hierarchy of field to file to record; it is the act of focusing in to get to the root cause.
Source: Adapted from Webopedia
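A hypothetical sketch of that category-to-record drill-down, using invented local-program records (the programs and reasons are made up for illustration):

```python
# Invented records: each has a program and a reason a family
# survey went unreturned.
records = [
    {"program": "Durham", "reason": "moved"},
    {"program": "Durham", "reason": "no response"},
    {"program": "Durham", "reason": "no response"},
    {"program": "Concord", "reason": "moved"},
]

def drill_down(records, *fields):
    """Count records at each successive level of detail.
    With no fields, return the top-level total; each added field
    splits the counts one level deeper."""
    if not fields:
        return len(records)
    first, rest = fields[0], fields[1:]
    groups = {}
    for r in records:
        groups.setdefault(r[first], []).append(r)
    return {key: drill_down(group, *rest) for key, group in groups.items()}
```

Starting from the statewide total and adding one field at a time mirrors the slide's general-category-to-record progression: the total, then counts by program, then counts by program and reason.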

A hypothesis is defined as "…a starting-point for further investigation from known facts" (The Concise Oxford Dictionary, 1990).

Why is a Hypothesis Important?
A good hypothesis will help you:
1. Focus your investigation
2. Avoid "losing the forest for the trees"
3. Stay on course in your investigation

Data Analysis Plan
The analysis plan provides an outline of the additional data that need to be analyzed to test the hypothesis and determine root cause; it also helps with preparing a clear and concise presentation of the results of your analysis activities.

Root Cause
Determining the root cause enables the creation of effective actions to prevent the problem from recurring.

Data-Based Decision Making
Data should be used to drive:
- Root cause analysis leading to a hypothesis (or hypotheses)
- Improvement planning
- Evaluation (effectiveness)

Next Steps
- Implementation of the pilot experience
- Use information for improvement and statewide implementation strategies

Questions/Comments