Child Assessment: Background, UPK Data Analysis, and Next Steps
September 2009
Points to Consider
- What are other states doing now, and can we learn from this?
- Should the state require programs to use one tool or continue to allow choice from four separate tools? Should other tools be considered?
- What does proficiency across domains look like for children? Are there certain results that are considered “good”?
- As measured by these tools, are particular domains more predictive of later outcomes?
- What do the assessment tools tell us about child progress, and what specifically do we want to measure?
Statement of Intent
- Massachusetts is in the early stages of developing a statewide system to measure the developmental progress of its young children
- EEC is engaging parents, providers, program administrators, teachers, higher education institutions, and policy makers to build a responsive approach
- This initiative is separate from (and would not replace) the developmental information that programs gather about children to use for curriculum planning and to individualize instruction
Measuring School Readiness Across the Country
- Massachusetts is joining a rapidly growing trend to understand school readiness
- 29 states currently collect statewide data on children’s progress
- These efforts are often directly connected to state-funded preschool efforts
Statewide Measures of School Readiness vs. Child Assessments and Screenings
- Many providers are already using a developmental assessment or screening tool to inform practice and individualize instruction
- Providers currently use a variety of assessment measures
- UPK grantees are required to use one of four assessment systems:
  - Creative Curriculum Developmental Continuum
  - Ages & Stages
  - High Scope Child Observation Record (COR)
  - Work Sampling System
- Statewide measurement of school readiness is not intended to replace program-level assessment practices
Statewide Measures of School Readiness vs. Child Assessments and Screenings
Different purposes
- Statewide system: information about the success of all children in Massachusetts
- Program-level assessments: information for parents and caregivers about individual children
Different level of information about the child
- Statewide system: measures a small number of indicators of school readiness
- Program-level assessments: comprehensive look at child progress across all developmental domains
Statewide Measures of School Readiness vs. Child Assessments and Screenings
Different usefulness to providers
- Statewide system: serves broader policy purposes
- Program-level assessments: help providers support each child’s growth and development
Other differences
- Statewide system: children will be anonymous when data are reported
FY09 UPK Classroom Quality Spring Assessment Data Analysis
UPK Assessment Tools
UPK assessment tools were chosen based on a number of factors. Each of the tools:
- Was already being used in the field (determined by a survey of programs)
- Aligns with the Guidelines for Preschool Learning Experiences*
- Covers all domains of child development
- Has online components (excluding Ages & Stages)

*Each tool aligns with the Guidelines, but the tools do not align with each other.
UPK Data Reporting - Context
- FY09 UPK Classroom Quality grantees were required to submit “spring” assessment data (collected between January 1, 2009 and June 30, 2009) to EEC
- Purpose: for EEC to monitor that UPK programs (designated UPK classrooms and family child care homes) are using one of the four EEC-approved child assessment systems and are regularly assessing children and entering child data in those systems*

*At a minimum, UPK programs are required to assess children during the fall and spring of each year.
Communication to Grantees Regarding Assessment Data Use by EEC
The exact language communicated to grantees about how the data will be used is as follows: “EEC will only be monitoring aggregated classroom-level reports to confirm that programs are assessing children in UPK-funded classrooms or homes. EEC will not examine individual child data and will not use this information to assess the strengths or weaknesses of individual programs. EEC may analyze statewide assessment information to inform future planning and professional development opportunities.”
Summary of Data Submitted
Table 1: Number of Programs/Providers that Submitted FY09 Spring Data (with response rate) by Assessment Tool and Program Type

| Program Type | Creative Curriculum | High Scope COR | Work Sampling | Ages & Stages | Total |
|---|---|---|---|---|---|
| Centers | 72 (93%) | 19 (95%) | 28 (100%) | - | 119 (95%) |
| FCC System | 23 (100%) | | | 44 (98%) | 67 (99%) |
| FCC Independent | 2 (100%) | | 1 (100%) | | 3 (100%) |
| Public Schools | 4 (100%) | | 9 (90%) | | 13 (93%) |
| Private Schools | | | | | |
| Total Programs/Providers | 102 (95%) | | 38 (97%) | | 203 (96%) |

*Programs/providers that received new FY09 UPK Classroom Quality grants (82 programs/providers) were not required to submit spring data and are not included in the total.
Data Submitted, Continued
Table 2: Number of UPK Programs, Classrooms and Children for Whom FY09 Spring Data Was Submitted by Assessment Tool

| Assessment Tool | Number of Programs | Number of Classrooms/Homes | Number of UPK Children |
|---|---|---|---|
| Creative Curriculum | 102 | 198 | 3,557 |
| High Scope COR | 19 | 37 | 533 |
| Work Sampling | 38 | 77 | 993 |
| Ages & Stages* | 44 | 44 | 44 |
| Total | 203 | 356 | 5,127 |

*Providers using Ages & Stages were required to submit an assessment for one preschool-aged child in the home, while center-based, public and private school programs were required to submit assessment reports that included data for all children in UPK classrooms.

Note: Please see Appendix A for additional information about data submitted by assessment tool.
Creative Curriculum – Sample Data
The chart below shows data from the Language Development domain as part of the “Snapshot” report from FY09 spring data for UPK programs using Creative Curriculum online.
Work Sampling – Sample Data
The chart below shows data from the Language and Literacy domain as part of the “Outcomes” report from FY09 spring data for UPK programs using Work Sampling online.
Online Systems Lessons Learned
- Online systems include data on non-UPK classrooms:
  - UPK programs are permitted to use the online system for all children in their program regardless of UPK status
  - Assessment Planning grantees
  - Non-UPK programs from agencies
- The number of UPK children assessed is a combination of actual data submitted and estimates based on the number of UPK classrooms; limitations within the online systems prevent distinguishing between children in UPK-funded and non-UPK classrooms
- In some cases, the number of children we expected to have data for is greater than the actual number. For example, the number of UPK children in programs using Work Sampling online was estimated at 1,224, but the online system contains data for only 885 children identified as “UPK” (possibly because programs did not identify UPK children as such)
Online Systems Lessons Learned, Continued
- Reports from the online systems vary among the publishers:
  - Creative Curriculum and Work Sampling allow aggregated reports that include only UPK programs (though these reports include non-UPK classrooms)
  - High Scope COR allows only statewide reports, which include non-UPK programs
  - Work Sampling does not allow reports to be run on all preschool children; you must choose among Preschool 3, Preschool 4, Head Start 3, and Head Start 4
- Data from each assessment tool may not be comparable across the tools
- While a high percentage of the programs using Work Sampling and Creative Curriculum reported FY09 assessment outcomes online, data was also submitted in other formats
- Compiling the data from manual reporters will be time consuming across all tools, as hard-copy data comes in multiple forms and includes both program-level and individual child reports
Drawing Conclusions
Original purpose of assessment tool use in UPK programs:
- The tools were developed to assist programs in improving and individualizing curriculum based on children’s developmental progress
- Assessment tools also help flag children who may need further diagnostic screening
Caveats related to drawing conclusions:
- UPK programs expressed concern over the consistency of the data reported and how EEC would evaluate the data (including whether it would be used to look at progress made by children)
- Many programs expressed a desire to provide anecdotal information about how their data was collected and why it looks the way it does, if the data were to be analyzed by EEC
Drawing Conclusions
- A statement could be made about a “snapshot” of where UPK children were in spring:
  - Programs were only required to submit data for spring 2009, though a small number of programs have data for multiple checkpoints
  - There is significantly more spring data than combined fall-and-spring data
- Broad statements could be made about the progress children made in UPK programs, but:
  - It would be difficult to say this progress is due to their experience in the program
  - The data collected will not allow us to compare UPK children to children in other programs

Note: With clearer communication to grantees about spring and fall data reporting, much of the inconsistency related to having program data for only one of the two checkpoints could be eliminated for FY10.
Next Steps
Points to Consider
- What are other states doing now, and can we learn from this?
- Should the state require programs to use one tool or continue to allow choice from four separate tools? Should other tools be considered?
- What does proficiency across domains look like for children? Are there certain results that are considered “good”?
- As measured by these tools, are particular domains more predictive of later outcomes?
- What do the assessment tools tell us about child progress, and what specifically do we want to measure?
Work Sampling in Other States
Maryland
- Maryland uses Work Sampling, modified for its Department of Education, to assess children’s readiness in language, literacy, and math concepts when they enter kindergarten
- Kindergarten teachers assess children’s proficiency in developmental domains when children enter kindergarten
- The percentage deemed proficient is reported annually as a measure of children’s school readiness
Work Sampling in Other States
Pennsylvania
- Pre-K Counts programs are required to use Work Sampling online and report to the state three times per year
- Data is used to illustrate progress made by all children in Pre-K Counts programs
- The state has a supplementary data system, the Early Learning Network, where programs enter teacher, classroom, and family reporting information

Arkansas
- ABCSS children are assessed annually using an instrument that correlates to the Arkansas Early Childhood Frameworks
- The assessment facilitates two important goals: classroom planning and program accountability
Measuring Long Term Outcomes in Arkansas
Child assessment tools:
- Peabody Picture Vocabulary Test-3
- Preschool Comprehensive Test of Phonological and Print Processing (Pre-CTOPP)
- Woodcock-Johnson Tests of Achievement
- Social Skills Rating System (Gresham & Elliott, 1990)
- Work Sampling

Children are assessed and compared using those instruments, which measure:
- Vocabulary
- Phonics
- Math skill development
- Social development

Arkansas uses these tools to assess the summative progress of a sample of all children in the state.
Massachusetts Kindergarten Assessments
- Massachusetts kindergartens use a variety of tools to measure children’s school readiness when children enter kindergarten
- In 2006, the ESE surveyed kindergartens and, based on responses from 127 programs, determined that these programs used program-level assessments to improve classroom activities and identify children’s progress; slightly more than 30% of kindergartens used one of the four UPK assessment tools
- Many kindergartens also have program-level assessments. The breakdown of how this information is used:
  - Plan/adapt curriculum
  - Communicate with parents
  - Plan classroom activities
  - Identify children for referral to special education – 91%
  - Share with first grade teachers or others
  - Inform and complete children’s progress reports – 90%
  - Share with teachers who are working with the child – 89%
  - Determine areas for more training – 73%
  - Other (e.g., Title I, to support differentiated instruction)
One Tool vs. Multiple Tools for Program-Level Assessments
One tool:
- Pros: Allows for consistent data reporting, which is more easily monitored and analyzed by EEC
- Cons: Requires many programs to switch tools (resulting in retraining and startup costs) or use a tool that may not best meet their needs

Multiple tools:
- Pros: Allows programs the flexibility to choose the best tool for their program type and curriculum philosophy
- Cons: Results in data coming from four different sources that cannot be aligned with one another
Proficiency Across Domains
- Data provided by the program-level assessment tools do not tell us what level of proficiency is considered “good,” as this is not the purpose of the tools
- Publishers of the tools do not share data from other programs, states, etc., so there is no solid benchmark against which data can be compared
- One state, Maryland, has devised its own scoring system based on Work Sampling to determine whether children are proficient
Predictability of Work Sampling for Later Outcomes
- Maryland uses Work Sampling (WS), modified for its Department of Education, to assess children’s readiness in language, literacy, and math concepts when they enter kindergarten; a recent study validated WS against other measures in three domains (language, literacy, and mathematics)
- Arkansas uses data from WS, along with a number of other validated and reliable measures, to assess school readiness in a sample of all children in the state (this is separate from using Work Sampling in programs as a formative program-level assessment tool)
- Pennsylvania uses WS only to determine the broad proficiency of its Pre-K Counts children at the program level
Appendix A: Data Submitted by Assessment Tool
Data Submitted by Assessment Tool
For the tables that follow, the number of UPK children with data is a combination of actual data submitted and estimates based on an average of 17 preschool children per UPK classroom and 3 preschool children per UPK family child care home.

Table 3: Creative Curriculum: Number of UPK Programs, Classrooms and Children with Data by Data Type

| Data Type | Number of Programs | Number of Classrooms/Homes | Number of UPK Children |
|---|---|---|---|
| Online License* | 78 | 158 | 3,028 |
| Program-level Report | 21 | 36 | 472 |
| Individual Child Reports | 3 | 4 | 57 |
| Total | 102 | 198 | 3,557 |

*The number of children in the online license for creativecurriculum.net is 3,028 for the spring checkpoint; 2,096 children have data entered for both fall and spring.
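The estimation rule above (use the reported count where a program submitted one; otherwise assume 17 children per UPK classroom and 3 per family child care home) can be sketched as a small calculation. This is an illustrative sketch only; the function name and sample inputs are hypothetical, not part of EEC's reporting system:

```python
# Sketch of the estimation rule described above: use the actual child count
# when a program reported one; otherwise estimate 17 children per UPK
# classroom plus 3 per family child care home. Names/inputs are illustrative.

CHILDREN_PER_CLASSROOM = 17
CHILDREN_PER_FCC_HOME = 3

def estimate_children(actual_count=None, classrooms=0, fcc_homes=0):
    """Return the reported child count if available, else an estimate."""
    if actual_count is not None:
        return actual_count
    return classrooms * CHILDREN_PER_CLASSROOM + fcc_homes * CHILDREN_PER_FCC_HOME

# A program that reported data for 57 children keeps its actual count:
print(estimate_children(actual_count=57))                   # 57
# A program with 2 UPK classrooms and 1 FCC home, no reported count:
print(estimate_children(classrooms=2, fcc_homes=1))         # 2*17 + 1*3 = 37
```

Summing such per-program figures is how the "combination of actual data submitted and estimates" totals in Tables 3 through 6 would be produced.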
Data Submitted by Assessment Tool, Continued
Table 4: High Scope COR: Number of UPK Programs, Classrooms and Children with Data by Data Type

| Data Type | Number of Programs | Number of Classrooms/Homes | Number of UPK Children |
|---|---|---|---|
| Online License | 3 | 5 | 85 |
| Program-level Report | 14 | 29 | 406 |
| Individual Child Reports | 2 | 3 | 42 |
| Total | 19 | 37 | 533 |

*Four programs are using earlylearner.net as their online system and have submitted program-level reports generated from that particular online database, which does not align completely with onlinecor.net.
Data Submitted by Assessment Tool, Continued
Table 5: Work Sampling: Number of UPK Programs, Classrooms and Children with Data by Data Type

| Data Type | Number of Programs | Number of Classrooms/Homes | Number of UPK Children |
|---|---|---|---|
| Online License* | 36 | 72 | 885 |
| Program-level Report | 1 | 3 | 48 |
| Individual Child Reports | 1 | 2 | 60 |
| Total | 38 | 77 | 993 |

*For the number of children in UPK classrooms, 885 reflects the number of children who have data in at least one of the three checkpoints.
Data Submitted by Assessment Tool, Continued
Table 6: Ages & Stages: Number of UPK Programs, Classrooms and Children with Data by Data Type*

| Data Type | Number of Programs | Number of Classrooms/Homes | Number of UPK Children |
|---|---|---|---|
| Online License | - | - | - |
| Program-level Report | - | - | - |
| Individual Child Reports | 44 | 44 | 44 |
| Total | 44 | 44 | 44 |

*Providers using Ages & Stages were required to submit an assessment for one preschool-aged child in the home.