1 What did Connecticut do?

2 The Basics A group of stakeholders met to advise the lead agency. We chose the NCSEAM survey and distributed the surveys to families who were eligible on a specific date (just like the 618 Table 1 point-in-time count). Service Coordinators hand-delivered the surveys in sealed envelopes, along with stamped return envelopes. Surveys (each printed with a unique identifier) were returned directly to the lead agency and scanned in.

3 The Basics In Year One, we also mailed surveys to recently exited families. The response rate for hand-delivered surveys was much higher (26% vs. 19%), so… In Year Two we sent surveys only to families currently PARTICIPATING in the program, but… Return rates were low! So we did a follow-up mailing ($$$$) and ended up with a response rate of 41%.

4 The Basics THIS IS A CENSUS APPROACH to distributing the surveys: we gave the surveys to ALL ELIGIBLE families who had been in Birth to Three for at least 6 months. After Year One, OSEP asked Connecticut to clarify the details of what we had done to assure representativeness. READY?

5 This is basically your handout – don’t try to read it.

6 The Details We interpreted the “population of children with disabilities in the early intervention program” to mean our official 618 child count, which is the only official demographic data reported to OSEP. We called this our Target Group.

7 Connecticut reported the percentages by both Race/Ethnicity and Gender for the… –Target Group (FFY child count - Table 1) –Census (children whose families were sent a survey) –Respondent Pool (children whose families returned a completed survey) And… SURPRISE! The respondent pool wasn’t representative!
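The comparison behind this slide can be sketched in a few lines. This is a minimal illustration, not Connecticut’s actual figures: the demographic counts and the ±3 percentage-point tolerance below are invented assumptions.

```python
TOLERANCE = 3.0  # percentage points; an assumed flagging threshold

def percentages(counts):
    """Convert raw category counts to percentages of the total."""
    total = sum(counts.values())
    return {k: 100.0 * v / total for k, v in counts.items()}

# Hypothetical demographics: 618 child count (target group) vs. returned surveys.
target = {"White": 600, "Black": 200, "Hispanic": 180, "Other": 20}
respondents = {"White": 520, "Black": 90, "Hispanic": 60, "Other": 5}

t_pct, r_pct = percentages(target), percentages(respondents)
for group in target:
    gap = r_pct[group] - t_pct[group]
    status = "FLAG" if abs(gap) > TOLERANCE else "ok"
    print(f"{group:<8} target {t_pct[group]:5.1f}%  "
          f"respondents {r_pct[group]:5.1f}%  gap {gap:+5.1f}  {status}")
```

With these made-up numbers, most categories are flagged, which is exactly the “SURPRISE!” situation the slide describes.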

8 The Details

9 Random and Representative So we used SPSS to randomly select cases in order to create groups that would match the 618 percentages. –By Race/Ethnicity –By Gender –A crosstab of both –(We also did “Region” for our ICC.)
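The random-selection step above can be sketched in Python (Connecticut used SPSS; this is an illustrative equivalent, and the function name and data are assumptions, not the state’s procedure). Given the respondent pool and the 618 target proportions for one stratifying variable, it draws the largest random subsample whose strata match those proportions.

```python
import random

def representative_subsample(records, stratum_of, target_props, seed=0):
    """Randomly subsample records so each stratum matches target_props."""
    rng = random.Random(seed)
    pools = {}
    for rec in records:
        pools.setdefault(stratum_of(rec), []).append(rec)
    # The stratum in shortest supply relative to its target share caps N.
    n = min(int(len(pools.get(s, [])) / p) for s, p in target_props.items())
    sample = []
    for s, p in target_props.items():
        sample.extend(rng.sample(pools[s], round(n * p)))
    return sample

# Hypothetical pool: 520 White and 90 Black respondents, while the 618
# count says the split should be 75% / 25%.
pool = [{"race": "White"}] * 520 + [{"race": "Black"}] * 90
subsample = representative_subsample(pool, lambda r: r["race"],
                                     {"White": 0.75, "Black": 0.25})
```

Here the Black stratum is the limiting one, so the subsample keeps all 90 of those records and randomly drops White records until the 75/25 split holds.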

10 Race/Ethnicity

11 Gender

12 Race/Ethnicity X Gender
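A crosstab like the one on this slide can be built with the standard library alone. The six respondent records below are invented for illustration; only the structure (counts per Race/Ethnicity × Gender cell) mirrors the slide.

```python
from collections import Counter

# Hypothetical (race, gender) pairs, one per returned survey.
respondents = [
    ("White", "M"), ("White", "F"), ("Black", "F"),
    ("Hispanic", "M"), ("White", "F"), ("Black", "M"),
]

cells = Counter(respondents)                    # (race, gender) -> count
races = sorted({race for race, _ in respondents})
genders = sorted({g for _, g in respondents})

print("Race/Ethnicity  " + "  ".join(f"{g:>3}" for g in genders))
for race in races:
    row = "  ".join(f"{cells[(race, g)]:>3}" for g in genders)
    print(f"{race:<14}  {row}")
```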

13 Analysis Then we analyzed the results for each of the randomly selected representative groups. Then we reported EVERYTHING…

14 Analysis 4A – Know My Rights (77%)

15 Analysis 4B – Communicate About My Child (75%)

16 Analysis 4C – Help Me Help My Child (88%)

17 Remaining Questions “Week of Clarity” The preliminary APR response table said that CT –Didn’t include the N’s and –Didn’t meet its target for Indicator 4C (OSEP used the results from our respondent pool) So we revised our APR to –Include the N’s and –Report only results that came from representative data

18 Original APR 4C – Help Me Help My Child (88%)

19 REVISED APR The respondent pool was N=875, and from it the following representative groups were selected.

20 Remaining Questions The final APR response table said that CT –Didn’t report the total N for the Respondent Pool –Still didn’t meet its target for Indicator 4C Because we are required to assure representativeness, which results should be reported to and used by OSEP?

21 Remaining Questions Dr. Paula LaLinda used Connecticut’s data for her dissertation. She analyzed many more variables, including language spoken in the home and insurance type (public or commercial). This was more illustrative.

22 Remaining Questions States may be using their family outcome data in ways that are a priority for stakeholders but not required for the APRs. What are the minimum requirements for the APRs as related to reporting representativeness?