A Comprehensive Early Childhood Screening & Assessment System
Part I: Early Childhood Assessment: Best Practices
Opening discussion: What comes to mind when you hear the word "assessment"? Why do we assess children?
Assessment Defined … a process of gathering information about a child (or group of children) for the purpose of making decisions …. Ask – What decisions are made based on the information gathered in assessment?
Wisconsin Model Early Learning Standards: Assessment Considerations
1. Young children learn in ways and at rates different from older children. 2. Young children come to know things through doing as well as through listening, and often represent their knowledge better by showing than by telling. Discussion: Do you agree? Ask for examples to support these claims.
Wisconsin Model Early Learning Standards: Assessment Considerations
3. Young children's development and learning are rapid, uneven, and episodic, so point-in-time assessments do not give a complete picture of their learning. 4. Young children's achievements are the result of a complex mix of their ability to learn and their past learning opportunities. Discussion (#3 above): Do you agree? Discuss the pros and cons of "point-in-time" assessment data vs. ongoing assessment data.
WHY we assess children - 1
A. To monitor children's development and learning
Screening: a process, including administration of a quick, valid, and reliable screening tool or tools, to guide decisions about "next steps" to address individual children's needs (developmental and academic).
Stress that you will be describing four purposes of assessment, four reasons why we assess. ASQ and ASQ:SE are developmental screening tools widely used in WI. PALS PreK provides formative data; its use is to inform instruction.
WHY we assess children - 1
B. To monitor children's development and learning
Ongoing assessment: data collected over time, using multiple methods, often anchored to a criterion-referenced assessment tool; can increase in intensity or frequency when used with 'interventions' or 'additional challenges'.
"Over time" = ongoing, daily, authentic assessment (observation, work samples). "Anchored to a tool": as with TS Gold, gather authentic data over time, then at specific checkpoints analyze your data, compare them to the criteria for a rating on the scale, and enter the rating. (This becomes BENCHMARK data; it shows child progress over time compared to the expectation.)
WHY we assess children - 2
Eligibility: to determine if a child might benefit from special education (Birth to 3 or Early Childhood Special Education). "Evaluation" is a more in-depth assessment by a team of specialists, including the classroom teacher and family. It can lead to "placement," an IFSP or IEP, and related services. Self-explanatory.
WHY we assess children - 3
To guide our planning and decision making
Curriculum decisions / what to teach
Thematic units / projects / lesson plans
For individual children and groups of children
Third bullet: we plan for universal instruction (what all kids get) and for individual children: those who need additional supports (based on our data) or additional challenges (advanced children who already know what I'm teaching).
WHY we assess children - 3
To guide our planning and decision making: information used to decide WHAT and HOW to teach the child.
[Diagram: WI Model Early Learning Standards Teaching Cycle, with three phases: Assessment, Program Planning, Implementation]
From the WI Model Early Learning Standards; key to understanding the role of assessment in teaching: it is an ongoing cycle of assessing for the purpose of planning, then teaching/implementing the plan, then assessing children's learning.
WHY we assess children - 4
To report to or communicate with others
Accountability
Family conferences / report cards / progress reports
State requirements (early literacy)
Program evaluation: is the program achieving its intended outcomes?
Last bullet: child outcomes can be one element of program evaluation; many other variables must be considered as well (see YoungStar at )
Systematic On-going Assessment
Is the child making progress toward milestones or learning goals?
Often based on observations, work samples, portfolios
Serves to continually guide teaching decisions (i.e., 'curriculum')
"Snapshot" vs. "movie"
Can be "anchored" to a valid and reliable tool
A "snapshot" is specific data you collect at a specific point in time; viewing those snapshots over time becomes a "movie," a running narrative of the child's progress. Some assessment tools only give you a snapshot (PALS, for example); still useful data, but a different type of data.
HOW we assess children: Principles of Ongoing Assessment
Assessment is Authentic
Assessment is Based on Multiple Sources of Information
Assessment Information is Anchored to an Assessment Instrument that Facilitates the Interpretation of Progress
Assessment is Systematic, Continuous, and Guided by an Assessment Plan
On a day-to-day basis, what practices guide how assessment is done with children? 1. Use authentic assessment (show the next slide as a definition and come back to this slide). 2. Gather multiple sources of information; this includes gathering info FROM families, not just providing them with info. 3. Anchor assessment to a tool that facilitates the interpretation of progress over time (move forward 2 slides to the Strategic Assessment Process to explain this principle). 4. Develop a planned, systematic, and continuous assessment system (move 2 slides ahead to the DPI chart, Strategic Assessment System).
Definitions: Authentic Assessment
"The best way to understand the development of children is to observe their behavior in natural settings while they are interacting with familiar adults over prolonged periods of time." (Bronfenbrenner, 1977)
Functional assessment
Performance assessment
Authentic, functional, performance: terms used in different settings, but basically all the same type of data. (Go back to the previous slide; refer to the green box, #2.)
Strategic Assessment Process: anchoring authentic data to a valid and reliable assessment tool
[Diagram: a timeline from the beginning of the year to the end of the year/term. "Gathering documentation"* is continuous; the data are "anchored" to the tool at three checkpoints: 1st checkpoint (4-6 weeks into the year/term) = formative assessment data; 2nd checkpoint (midyear/semester) = benchmark AND formative assessment data; 3rd checkpoint (end of year/term) = benchmark AND summative assessment data.]
Self-explanatory. Light grey arrows (left to right, "Gathering Documentation") represent ongoing daily/weekly data collection, done before and between 'checkpoints' - times when the teacher reviews the ongoing data collected, compares it to the criteria (examples/descriptions) for a rating on the tool, and assigns a number (1-7).
The first checkpoint provides formative data, used to plan instruction.
The second checkpoint initially provides benchmark data: it shows the child's progress since the prior checkpoint and assigns a number that can be used for comparison purposes within a class (i.e., most students earn a "5" on a specific objective; a couple earn a rating below that and a couple earn a rating above). The second checkpoint can then be used again as formative data: the teacher plans universal instruction based on what the majority is ready for; she also plans additional supports for those kids with lower ratings and decides whether challenging activities need to be provided for those who rate higher than the class average.
The third checkpoint initially provides benchmark data: it shows the child's progress since the prior checkpoint and assigns a number that can be used for comparison purposes within a class. The third checkpoint data can then be used as summative data: to summarize a child's learning over the year, to measure the class outcomes for the year, and as ONE SOURCE OF DATA to consider the overall effectiveness of the program.
RETURN TO SLIDE 13, Principles of Ongoing Assessment, last box (turquoise)/last principle.
*Gathering documentation: daily/weekly observation notes, photos, audio/video recordings, work samples, teacher-made tallies/rubrics, information from families and other service providers
Strategic Assessment System
By Type: What are the differences between assessment types within a strategic system?
FORMATIVE | INTERIM | SUMMATIVE
…assessments are designed to… quickly inform instruction | benchmark and monitor progress | evaluate learning
…by providing… specific, immediate, actionable feedback | multiple data points across time | cumulative snapshots
…through… daily, ongoing instructional strategies | periodic diagnostic/common assessments | standardized assessments
…that are… student/classroom-centered | grade-level/school-centered | school/district/state-centered
…and that answer… What comes next for student learning? What progress are students making? Is the program working? Are our students meeting the standards?
Wisconsin Department of Public Instruction
Consider an overall assessment plan as described in this chart. Allow time to study this guide. For more information, go to:
Make the connection, as demonstrated in the previous chart, that authentic data can be used for multiple purposes.
Part II: Early Childhood Assessment: A Comprehensive System
A Comprehensive Early Childhood Screening & Assessment System
BLUEPRINT for a Comprehensive and Aligned System for Screening and Assessment of Young Children. Prepared for the Wisconsin Governor's Early Childhood Advisory Council by the Wisconsin Early Childhood Collaborating Partners: Healthy Children Committee (serving as the ECAC Screening and Assessment Project Team), February 2012.
Wisconsin Governor's Early Childhood Advisory Council (ECAC) mission: Every child will be healthy, nurtured, safe, and successful.
Conduct periodic needs assessments
Identify cooperative/collaborative opportunities and barriers
Develop recommendations to increase child/family participation
Identify professional development needs
Provide a brief history of the Blueprint: who developed it and why. Screening and assessment practices address all 4 bullet points. For more information/background on the Blueprint:
BLUEPRINT Key Points: Birth – 5 years focus; across systems
Prevention, early intervention, and treatment are important for improving child outcomes and healthy families.
Developmentally appropriate and valid screening and assessment data are the cornerstone of informed decision making.
Collecting both universal and targeted screening and assessment data is efficient and effective.
"Across systems": care and education, health, etc. Provide examples of 'universal' (PALS) and 'targeted' (checking blood lead levels of children who live in older housing).
A Comprehensive Early Childhood Screening & Assessment System
Child level, Program level, Systems/Institutions level. Visual of the slide that follows: demonstrates three levels of practices that need to be coordinated/aligned to create a comprehensive system, as described by the Center on Enhancing Early Learning Outcomes (CEELO) at
A Comprehensive Early Childhood Screening & Assessment System
Child: all developmental domains (motor, social-emotional, language, approaches to learning, and cognition), plus content areas (literacy and math)
Program: comprehensive data on each child over time; teacher effectiveness, environmental and other program quality measures
Systems: all who "touch the lives of young children and their families" - health care, education, Head Start, mental health, child care, home visiting, and IDEA programs
WHAT to assess / WHO is engaged
A Comprehensive Early Childhood Screening & Assessment System
Child level: monitoring development & learning; determining eligibility; planning for instruction and "next steps" for individual children and groups of children; communicating with families
Program level: accountability; assessing services/program quality and effectiveness; professional development
Systems-Institutions level: policy development; resource allocation; professional development
Purpose/uses of data at each level
A Comprehensive Early Childhood Screening & Assessment System
No examples of fully implemented comprehensive systems in the US; still in developmental stages.
2012: of the states that offered state-funded pre-K programs, 34 required assessments of children attending these programs. Of these, 19 states allowed districts to select their own tool(s), 9 provided a list to choose from, and 6 had mandated tool(s).
Schilder, D., & Carolan, M. State of the States Policy Snapshot: State Early Childhood Assessment. CEELO, March 2014.
National perspective
A Comprehensive Early Childhood Screening & Assessment System: Kindergarten Entrance Assessments (KEAs)
Data collected after children begin kindergarten, within the first 2 months
Assesses development in the 5 Essential Domains of School Readiness
Aligned with the state's early learning standards
Valid and reliable for its intended purposes
A current trend in comprehensive systems often includes a KEA. The 5 Essential Domains of School Readiness are the same as the domains found in the WI Early Learning Standards: Health and Physical Development; Social & Emotional Competence; Communication & Early Literacy; Approaches to Learning; Cognition & General Knowledge. FOR MORE INFORMATION/RESOURCES ON KEAs, GO TO
A Comprehensive Early Childhood Screening & Assessment System: Kindergarten Entrance Assessments (KEAs)
Overall purpose:
To inform efforts to close the school readiness gap
To inform instruction
To inform parents and involve them in their children's education
NOT intended to prevent children's entry into kindergarten
Stress that a KEA is never intended to "sort" children into "ready" and "not ready" groups; data are collected AFTER children enter school and used to plan for instruction and supports.
Selected Assessment System Examples from Other States
K-3 Formative Assessment Consortium: North Carolina (lead), Arizona, Delaware, District of Columbia, Iowa, Maine, North Dakota, Oregon, Rhode Island, & South Carolina
Working to design a formative assessment system that begins with a KEA and continues through 3rd grade
Note this does not specifically include pre-K service providers, but begins at 5K (not consistent with Wisconsin's Blueprint)
FOR A CHART WITH LINKS TO STATE KEAs, GO TO
Selected Examples from Other States
California's Desired Results System
Administered by the California Department of Education (CDE); applies to child care and development services for children, birth - 13 years, and their families
Compatible with CDE's accountability system for elementary and secondary education
Includes a KEA, the Desired Results Developmental Profile - School Readiness (DRDP-SR)
Coordinated with Desired Results: Access for Children with Disabilities (DR Access)
Desired Results for birth - 13 year olds and their families: Children are personally and socially competent. Children are effective learners. Children show physical and motor competence. Children are safe and healthy. Families support their children's learning and development. Families achieve their goals.
Selected Examples from Other States
Washington's public-private partnership
Washington State Early Learning and Development Guidelines, birth through grade 3 (K-3 jointly with CCSS, to include the social-emotional domain)
Washington's Kindergarten Inventory of Developing Skills (WaKIDS), considered a process for school readiness and transition into 5K
Includes families; 5K teachers; and child care, Head Start, & other pre-K service providers
TS Gold required at 5K entrance as the KEA; pre-K providers encouraged to use TS Gold; a TS Gold alignment guide provided for other tools
Child data shared across systems to support smooth transitions to 5K and continuous teaching and learning
Public-private partnership members: the Department of Early Learning (DEL), the Office of Superintendent of Public Instruction (OSPI), and Thrive by Five Washington
Selected Examples from Other States
New Jersey: PreK - grade 3 model targeting 31 high-poverty districts
Requires a collaboration plan with preschools, kindergartens, and elementary schools to share individual child data via portfolios
TS Gold used as a 'KEA checkpoint'
FOR A CHART WITH LINKS TO STATE KEAs, GO TO
Selected Examples from Other States
Georgia's 'Bright from the Start'
Birth to age 5 focus (includes universal Pre-K)
Early learning standards aligned with K-3 state standards
Online Work Sampling System to share child data as they transition into 5K
Georgia Kindergarten Inventory of Developing Skills (GKIDS): an ongoing, year-long assessment to determine student skills entering and exiting 5K (the KEA)
FOR A CHART WITH LINKS TO STATE KEAs, GO TO
Closing thoughts …
Child assessment involves multiple sources of information, including valid and reliable tools
"Screening" is linked to appropriate follow-up
Assessment is an on-going, continuous process
Families are contributors in the process
Procedures are developmentally appropriate (age / individual / cultural-linguistic)
Effective training/professional development opportunities are provided
Child assessment is linked to a larger system
Overall purpose is to improve child outcomes
SELF EXPLANATORY
Reconsider how and why we assess children. What's needed for YOU to have a more comprehensive system? Self-explanatory.