Approaches to Measuring Child Outcomes
Kathy Hebbeler, ECO at SRI International
Prepared for the NECTAC National Meeting on Measuring Child and Family Outcomes, Albuquerque, NM, April 2006
2 What is happening
Outcomes measurement is difficult and very complex
No one group has all the answers – or even most of the answers
There are some exciting things going on around the country
3 Purpose of the meeting
Share our challenges and what we have learned so far
Contribute to the collective knowledge base
Advance the discussion incrementally, moving closer to producing outcomes data
*** for the ultimate good of children and families ***
4 Critical events
Spring 2005 – ECO submitted recommendations to OSEP on what should be collected with regard to child and family outcomes
Summer 2005 – OSEP released the reporting requirements
December 2005 – States submitted their plans for outcome data collection in their State Performance Plan
Spring 2006 – States are collecting data
5 The Number One Question*: What are other states doing? Have other states done X?
*In two forms
6 Purpose: The Overriding Question
Why is a state collecting data on child outcomes?
Context, resources, values, etc. enter into the answer
7 Purpose: WHY?
To respond to federal reporting requirements
To meet provider/teacher, local, and/or state need for outcome information and to respond to federal reporting requirements
Context & Values Drive Decisions
Resources? Stakeholder input? Burden on locals? Standardized assessment? Authentic assessment? Interagency issues? Local control? Minimize change? Early learning guidelines? Multiple sources of information? Policymakers want data? Other early childhood initiatives?
9 Who is included in the outcomes system?
Pt. C system; Pt. B system
Some blending of C and B: same assessment, data sharing, data linking
Early Childhood System that includes C and B
10 How does the state get data on outcomes?
Who provides?
What assessments are used?
How often is data collected?
When is data collected? (When is it reported?)
Dealing with multiple sources?
Dealing with different assessments?
How Outcomes Data Get to the State Agency
12 Analysis of SPPs
Analyses are based on SPP reports submitted in December 2005
Pt. C N=56; Pt. B N=58
Limitations
  Variation in level of detail provided
  Landscape keeps changing
Analysis done by Lynne Kahn and staff at UNC/FPG
13 “Camera” Issue: Capturing Child Functioning
What are the sources of the information on child functioning?
What kind of assessment tools are states planning to use?
14 Capturing Child Functioning: How many sources?
Multiple sources
  Pt. C: 50% (28 states)
  Pt. B: 16% (16 states)
One data source
  Pt. C: 39% (22 states); assessment instrument (21 states)
  Pt. B: 55% (32 states); assessment instrument (31 states)
15 Issues Raised
Data needs to reflect a child’s functioning in each broad outcome area
Functional outcomes summarize each child’s current functioning across settings and situations
Best practice for assessing young children recommends the use of multiple measures
Will single sources (= assessment tool) produce valid data on functional outcomes? How good is the camera?
16 The Child Outcomes
Children have positive social relationships
Children acquire and use knowledge and skills
Children take appropriate action to meet their needs
17 Part C Outcomes Data Sources
Data Source                      #    %
Formal assessment instruments    45   80%
Parent report                    25   45%
Observation                      14   25%
Clinical opinion                 10   18%
IFSP goals & objectives           6   11%
Record review                     4    7%
Not reported                      6   11%
18 Preschool Outcomes Data Sources
Data Source                      #    %
Formal assessment instruments    45   80%
Observation                      12   21%
Parent report                    11   19%
Teacher/provider report           8   14%
IEP goals & objectives            1    2%
Clinical opinion                  1    2%
Not reported                     10   17%
19 Role of Families
Impossible to understand how a child is functioning across a variety of everyday settings and situations without family input
Options
  Incorporated into the assessment tool
  Collected through a parent-completed tool
  Incorporated into a summary rating
Issue: How is information from families being included?
20 Capturing Child Functioning: Approaches to identifying assessment tools
One assessment selected by state
List of assessments developed by state; programs pick
Programs can use whatever they have been using
21 Capturing Child Functioning: Assessment Tools Being Used
Part C – 20 different assessment tools identified; 3 states using a state-developed tool
Part B – 43 different assessment tools identified; 7 states using a state-developed tool
22 Commonly Reported Assessment Instruments: Part C
Of 28 states that listed specific assessment instruments:
  HELP – 15 states
  BDI/BDI2 – 13 states
  AEPS – 11 states
  Creative Curriculum – 6 states
  ELAP – 6 states
Not reported – 30 states
Not yet determined – 23 states
23 Commonly Reported Assessment Instruments: Preschool
Of 31 states that listed specific assessment instruments:
  BDI/BDI2 – 9 states
  Creative Curriculum – 8 states
  Brigance – 7 states
  High Scope COR – 6 states
  AEPS – 5 states
  State-developed assessments – 7 states
Not reported – 27 states
Not yet determined – 21 states
24 Capturing Child Functioning: Combining Information from Multiple Sources
Part C
  Using ECO Summary Form – 52% (29 states)
  Developing own summary tools – 7% (4 states)
Part B
  Using ECO Summary Form – 29% (17 states)
  Developing own summary tools – 10% (6 states)
25 Capturing Child Functioning: Timing
When and how often outcome information is collected is related to why the state is collecting data
Which assessment is used is also related to why
OSEP requirement is entry and exit
26 When will data be “collected”?
Aligned around the naturally occurring data review points in programs
“Collected” may mean
  Data reviewed/summarized to determine a functional level for each of the outcomes
  Summary rating or other data reported to state or OSEP
Some states did not report anything other than entry and exit (C – 28 states; B – 15 states)
27 When entry data will be collected (three general patterns)
[Timeline: Referral – Eligibility – Initial IFSP (e.g., goals, services, settings) – IFSP 6-month review (intervention planning)]
Around eligibility (based on evaluation data)
At the initial IFSP (based on evaluation and assessment data)
After services begin (based on evaluation, assessment, and/or ongoing progress monitoring data)
28 Part C examples of when data will be collected
Within 45 days of referral – 7 states
Within 1 month of IFSP – 2 states
Within 6 months of enrollment – 1 state
At initial IFSP and 6-month and annual reviews – 21 states
Within 2 months, 45 days, 3 months, or 6 months of exit
29 Preschool examples of when data will be collected
Initial evaluations/eligibility – 7 states
Initial IEP development – 4 states
Annual IEP reviews – 11 states
Time periods prescribed by curriculum-referenced tools (2 or 3 times a year) – 8 states
Annually at the end of the school year – 6 states
30 Which Children Will Be Included: Part C
All children – 40 states
After a pilot or phase-in period – 16 states
Sampling – 7 states
  1 sampling at exit (all children will have entry data)
  1 sampling at entry (will only collect entry and exit data on children in sample)
  Other 5 – could not tell from SPP
Not reported or undecided – 9 states
31 Which Children Will Be Included: Preschool
All children – 42 states
After a pilot or phase-in period – 15 states
Sampling – 8 states
  3 will collect data on ALL children, but select a sample to report to OSEP
  1 sampling at entry (will only collect entry and exit data on children in sample)
  Other 4 – could not tell from SPP
Not reported or undecided – 8 states
32 Collaboration between C and B
25 states reported in Part C SPP collaborating with Part B on outcomes
21 states reported in Part B SPP collaborating with Part C on outcomes
33 Collaboration with Other Early Childhood Initiatives
Collaborate with or align outcome efforts with broader early childhood accountability initiatives in their state
  Part C: 3 states
  Part B: 18 states
Issue: What are the outcomes being assessed in the broader initiatives?
34 Role of the Early Learning Guidelines
May change or add to the outcomes questions
  Are children meeting the ELGs?
May mean mapping the ELGs to the 3 OSEP outcomes
Aligning with ELGs: Part C – 8 states; Part B – 18 states
How Outcomes Data Get to the State Agency
36 Transfer Issues: How does information move?
In what form?
At what level of detail?
With what level of identification?
37 In what form?
Online
In an electronic file
On paper
38 At what level of detail?
Child Level Data
  Item-level data on the child (from an online assessment system)
  Scores on assessment tool
  ECO Summary Rating
  OSEP Categories (a, b, c)
  Other?
Aggregated Data
  Scores, ratings, OSEP categories, etc.
39 With what level of identification?
Only relevant for child-level data
Can outcome data be linked to other information through an ID?
Does it enter the system already linked?
Linkage to other data has major implications for analysis and the questions the state will be able to answer
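As a purely illustrative sketch (not from the presentation), the analytic payoff of ID-level linkage can be shown with hypothetical data: once child-level outcome ratings and other program records share a child ID, they can be joined and analyzed together. All table names, columns, and values below are invented for illustration.

```python
# Hypothetical sketch of linking child-level outcome data through an ID.
# Table names, columns, and ratings are invented, not any state's actual system.
import pandas as pd

# Child-level outcome ratings, one row per child per outcome area.
outcomes = pd.DataFrame({
    "child_id": ["A01", "A01", "A02"],
    "outcome_area": ["social_relationships", "knowledge_skills", "social_relationships"],
    "entry_rating": [3, 4, 5],
    "exit_rating": [5, 5, 6],
})

# Other program data keyed to the same ID, e.g., primary service setting.
services = pd.DataFrame({
    "child_id": ["A01", "A02"],
    "primary_setting": ["home", "center"],
})

# Linking through the shared ID lets the state ask questions such as
# "does progress differ by service setting?" -- something aggregated
# or unlinked data cannot answer.
linked = outcomes.merge(services, on="child_id", how="left")
linked["progress"] = linked["exit_rating"] - linked["entry_rating"]
print(linked.groupby("primary_setting")["progress"].mean())
```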
40 Training
Focused on various topics
  Training in assessment tools
  Training in use of the ECO Summary Form
Various approaches
Various levels of investment
ECO is developing and compiling training materials for the web site (including materials designed for parents)
Contact NECTAC or ECO for help
41 Conclusions
States are building many different kinds of outcomes measurement systems
Features of the system reflect the contexts and values of the state
We know some things about what states are doing, but the landscape keeps changing