Assessment, Evaluation, and Programming System for Infants and Children (AEPS™) and AEPSi, Part 1
KDEC 2013
Nancy Miller, M.Ed., OTR/L
Stephanie Parks, Ph.D., OTR/L
o Authentic assessment
o AEPS overview (brief)
o Fidelity in assessment
o Scoring, team consensus, and efficiency
o Pitfalls to fidelity
o Resources
Authentic Assessment
The root of the word assessment is assidere, which means "to sit beside and get to know."
o Familiar people…
o In familiar settings…
o With familiar objects/toys…
o Doing familiar things.
Adapted from: Sophie Hubbell, M.A.T., Kent State University
Assessment, Evaluation, and Programming System (AEPS) for Infants and Children (Second Edition)
AEPS Interactive (AEPSi)
A secure, web-based tool that provides an easy means to record, score, track, aggregate, archive, and report on the results of the AEPS Test.
http://aepsinteractive.com
http://www.aepsi.com
How are you currently using the AEPS (a curriculum-based assessment, or CBA)?
o Screening
o Eligibility
o Program planning
o Progress monitoring
o Program evaluation
What is the AEPS?
It is a comprehensive system that ties together assessment, goal development, intervention, and ongoing monitoring and evaluation.
The AEPS is:
o Criterion-referenced
o Curriculum-based
o Domain specific
o Developmental
o Usable to corroborate eligibility decisions
o Programmatic: it can help you determine priority goal areas and focus your interventions
Linked System Framework
AEPS DOMAINS
Organizational Structure of AEPS Test Items
Each area is divided into strands (Strand A, Strand B, …); each strand contains goals (Goal 1, Goal 2, Goal 3), and each goal contains associated objectives (e.g., Goal 2 includes Objectives 2.1, 2.2, and 2.3).
o Strands: easy to more difficult
o Goals: easy to more difficult
o Objectives become more difficult as the goal is approached.
© Jennifer Grisham-Brown, Kentucky Early Childhood Data Project, 2007
"One mark of excellent teachers is the frequency with which they evaluate and change children's instructional programs, continually adapting them to meet the needs of each child." (Bailey & Wolery, 1992)
FIDELITY
Evidence is needed that an assessment such as a CBA [AEPS] is administered, summarized, interpreted, and used in the way that it was designed, intended, and validated.
Grisham-Brown, J., & Pretti-Frontczak, K. (2011). Assessing Young Children in Inclusive Settings: The Blended Practices Approach. Baltimore: Paul H. Brookes Publishing Co.
Collecting Assessment and Evaluation Information with Fidelity: Scoring
o ONGOING observation (preferred), within routines/activities
o Direct test*
o Report
*Note: Scoring guidelines when using the Direct Test method are not the same as the Observation guidelines. Refer to page 47 in Volume 1.
Scoring Guidelines: Observation or Direct Test
2 = Consistently meets criterion
1 = Inconsistently meets criterion; emerging
0 = Does not meet criterion; currently does not exhibit the skill (in a preschool-aged child, the skill may not yet be expected)
Observation
From: Bricker, D. (2002). Assessment, evaluation, and programming system for infants and children. Baltimore: Paul H. Brookes Publishing Co.
Direct Testing
From: Bricker, D. (2002). Assessment, evaluation, and programming system for infants and children. Baltimore: Paul H. Brookes Publishing Co.
Scoring with Fidelity… It's not just a 0, 1, 2
Use notes and comments to enhance your assessment data:
A = assistance provided (1 or 0)
B = behavior interfered (1 or 0)
D = direct test (2, 1, 0)
M = modification/adaptation (2, 1, 0)
Q = quality of performance (2, 1)
R = report (2, 1, 0)
From: Bricker, D. (2002). Assessment, evaluation, and programming system for infants and children. Baltimore: Paul H. Brookes Publishing Co.
Last but not least: your written comments, which quantify, describe, and enhance the observations you have made.
Team Consensus Builds Scoring Fidelity
Team Discussions
o What do these scores mean to you? (consistently vs. inconsistently)
o The power of the Notes section: with assistance; modifications/adaptations
o Team use of the Comments section: under what conditions, with what strategies, etc.
o Team responsibilities discussion
Don't forget the Family Report
o Two levels: birth to three, and three to six
o Two sections:
  - Family Routines: often completed through conversation/interview
  - Family Observations: scored Yes, Sometimes, or Not Yet
Consider using the second section of the Family Report with community and daycare providers, since its skills parallel the AEPS CODRF across the developmental domains. Score as "Report."
Ways to gather and document
o Individually: CODRF (Child Observation Data Recording Form)
o Group: group routine/activity matrix
o Family Report I and II (Spanish version available)
Assessment Activity Plans
The AEPS comes with 12 prewritten activities for assessing a variety of children across developmental areas (see Volume 2), or you can create your own that parallel existing planned activities or those provided in the AEPS.
Adapted from: Sophie Hubbell, M.A.T., Kent State University
Administering the AEPS with Groups of Children
AEPSi Assessment Activities: Center-Based (Level 1 and Level 2)
o Book About Me
o Classroom Transitions and Routines
o Dramatic Play
o Meals and Snack
o Story Time
o Playdough and Manipulatives
o Outdoor Play
o Conversation with Familiar Adults
Jennifer Grisham-Brown, Kentucky Early Childhood Data Project, 2007
AEPSi Assessment Activities: Routine-Based (Level 1)
o Rough & Tumble
o Quiet Time
o Mystery Box
o Feeding & Meals
o Daily Transitions & Routines
o Conversations with Caregivers
o Busy Hands
© Jennifer Grisham-Brown, Kentucky Early Childhood Data Project, 2007
Reasons why CBAs are not implemented with fidelity
o Teachers may find the actual implementation of a CBA overwhelming, particularly in classrooms where large amounts of data must be collected on many children.
o Teachers sometimes lack the training or support to administer the CBA; frustrated teachers may implement the assessment with low fidelity.
Grisham-Brown, J., & Pretti-Frontczak, K. (2011). Assessing Young Children in Inclusive Settings: The Blended Practices Approach. Baltimore: Paul H. Brookes Publishing Co.
Pitfall #1: Most ECI assessments do not include administration checklists that can be used for integrity/fidelity checks.
Remedies:
o Use the authentic assessment fidelity measure by Grisham-Brown and colleagues (2008).
o Develop a fidelity measure that relates to the procedures of the assessment being used by your program.
Grisham-Brown, J., & Pretti-Frontczak, K. (2011). Assessing Young Children in Inclusive Settings: The Blended Practices Approach. Baltimore: Paul H. Brookes Publishing Co.
http://www.ehhs.kent.edu/ceecrt/index.php/research/current
Pitfall #2: Teams administer assessments without sufficient training or ongoing support.
Remedies:
o Use a coaching system in which teachers check in on one another and provide support for those who are new to using the assessment.
o Engage in ongoing professional development to ensure accuracy in scoring and use, and to avoid drift over time.
Grisham-Brown, J., & Pretti-Frontczak, K. (2011). Assessing Young Children in Inclusive Settings: The Blended Practices Approach. Baltimore: Paul H. Brookes Publishing Co.
"Regardless of how information is gathered, what information is gathered, or even how the information is summarized, if it isn't used to plan and guide instruction, then the process is a waste of the teacher's time and provides no advantage for young children." (p. 170)
Grisham-Brown, J., & Pretti-Frontczak, K. (2011). Assessing Young Children in Inclusive Settings: The Blended Practices Approach. Baltimore: Paul H. Brookes Publishing Co.
KITS/TASN TA Packet: Embedding Assessment into Daily Activities and Routines
http://www.kskits.org/ta/Packets/embedAssessment.shtml
AEPS BLOG: Screencast Series
http://aepsblog.blogspot.com/
Colorado Department of Education: Results Matter
http://www.cde.state.co.us/resultsmatter/RMVideoSeries_PracticingObservation.htm#top
National Center on Quality Teaching and Learning (NCQTL)
http://eclkc.ohs.acf.hhs.gov/hslc/tta-system/teaching
NCQTL
http://eclkc.ohs.acf.hhs.gov/hslc/tta-system/teaching/center/practice/ISS
http://www.aepsinteractive.com/
How can we collaborate and share across the state?
Nancy Miller, M.Ed., OTR/L: nmiller@bluevalleyk12.org
Stephanie Parks, Ph.D., OTR/L: slparks@bluevalleyk12.org