Assessment, Evaluation, and Programming System for Infants and Children (AEPS™)-AEPSi Part 1 KDEC: 2013 Nancy Miller, M.Ed., OTR/L Stephanie Parks, Ph.D., OTR/L

Presentation transcript:

Authentic Assessment
AEPS Overview (brief)
Fidelity in Assessment
Scoring, Team Consensus, and Efficiency
Pitfalls to Fidelity
Resources

Authentic Assessment
The root of the word assessment is the Latin assidere, which means "to sit beside and get to know."

o Familiar people...
o In familiar settings...
o With familiar objects/toys...
o Doing familiar things.
Adapted from: Sophie Hubbell, M.A.T., Kent State University

Assessment, Evaluation, and Programming System (AEPS) for Infants and Children (Second Edition)

AEPS Interactive (AEPSi)
A secure, web-based tool that provides an easy means to record, score, track, aggregate, archive, and report on the results of the AEPS Test.

How are you currently using the AEPS (a CBA)?
Screening
Eligibility
Program Planning
Progress Monitoring
Program Evaluation

What is the AEPS? It is a comprehensive system that ties together assessment, goal development, intervention, and ongoing monitoring and evaluation.

It is:
Criterion-referenced
Curriculum-based
Domain-specific
Developmental
It can be used to corroborate eligibility decisions.
It is programmatic: it can help you determine priority goal areas and focus your interventions.

Linked System Framework

AEPS DOMAINS

Organizational Structure of AEPS Test Items
Each Area is divided into Strands (A, B, ...), each Strand into Goals (1, 2, 3, ...), and each Goal into Objectives (e.g., Obj. 1.1, 1.2, 2.1). Strands are ordered from easy to more difficult, as are goals within a strand; objectives become more difficult as the goal is approached.
© Jennifer Grisham-Brown, Kentucky Early Childhood Data Project, 2007

"One mark of excellent teachers is the frequency with which they evaluate and change children's instructional programs, continually adapting them to meet the needs of each child." Bailey and Wolery, 1992

FIDELITY
Evidence is needed that an assessment such as a CBA [AEPS] is administered, summarized, interpreted, and used in the way that it was designed, intended, and validated.
Grisham-Brown, J., & Pretti-Frontczak, K. (2011). Assessing Young Children in Inclusive Settings: The Blended Practices Approach. Baltimore: Paul H. Brookes Publishing Co.

Collecting Assessment and Evaluation Information with Fidelity: Scoring
ONGOING
Observation (PREFERRED): within routines/activities
Direct Test*
Report
*Note: Scoring guidelines when using the Direct Test method are not the same as the Observation guidelines. Refer to page 47 in Volume 1.

Scoring Guidelines: Observation or Direct Test
2 = Consistently meets criterion
1 = Inconsistently meets criterion; emerging
0 = Does not meet criterion; currently does not exhibit the skill (in a preschool-aged child, the skill may not yet be expected)

Observation
From: Bricker, D. (2002). Assessment, evaluation, and programming system for infants and children. Baltimore: Paul H. Brookes Publishing Co.

DIRECT TESTING
From: Bricker, D. (2002). Assessment, evaluation, and programming system for infants and children. Baltimore: Paul H. Brookes Publishing Co.

Scoring with Fidelity... it's not just a 0, 1, or 2.

Use Notes and Comments to enhance your assessment data:
A = assistance provided (1 or 0)
B = behavior interfered (1 or 0)
D = direct test (2, 1, 0)
M = modification/adaptation (2, 1, 0)
Q = quality of performance (2, 1)
R = report (2, 1, 0)

From: Bricker, D. (2002). Assessment, evaluation, and programming system for infants and children. Baltimore: Paul H. Brookes Publishing Co.

Last but not least... your written comments quantify, describe, and enhance the observations you have made.

Team Consensus Builds Scoring Fidelity

Team Discussions
What do these scores mean to you? (Consistently vs. inconsistently)
The power of the Notes section: with assistance; modifications/adaptations
Team use of the Comments section: under what conditions, with what strategies, etc.
Discussion of team responsibilities

Don't forget the Family Report
Two levels (birth to three and three to six)
Two sections:
Family Routines: often completed through conversation/interview
Family Observations: scored Yes, Sometimes, Not Yet

Consider using the second section of the Family Report with community and daycare providers, since the skills parallel the AEPS CODRF across the developmental domains. Score these items as "Report."

Ways to gather and document
Individually: CODRF
Group: group routine/activity matrix
Family Report I and II (Spanish version available)

Assessment Activity Plans
The AEPS comes with 12 pre-written activities for assessing a variety of children across developmental areas (see Volume 2), or you can create your own that parallel existing planned activities or those provided in the AEPS.
Adapted from: Sophie Hubbell, M.A.T., Kent State University

Administering the AEPS with Groups of Children

AEPSi Assessment Activities: Center-Based (Level 1 and Level 2)
Book About Me
Classroom Transitions and Routines
Dramatic Play
Meals and Snack
Story Time
Playdough and Manipulatives
Outdoor Play
Conversation with Familiar Adults
Jennifer Grisham-Brown, Kentucky Early Childhood Data Project, 2007

AEPSi Assessment Activities: Routine-Based (Level 1)
Rough & Tumble
Quiet Time
Mystery Box
Feeding & Meals
Daily Transitions & Routines
Conversations with Caregivers
Busy Hands
© Jennifer Grisham-Brown, Kentucky Early Childhood Data Project, 2007

Reasons why CBAs are not implemented with fidelity
Teachers may find the actual implementation of a CBA overwhelming, particularly in classrooms where large amounts of data must be collected on many children.
Teachers sometimes lack the training or support to administer the CBA; they may become frustrated and implement the assessment with low fidelity.
Grisham-Brown, J., & Pretti-Frontczak, K. (2011). Assessing Young Children in Inclusive Settings: The Blended Practices Approach. Baltimore: Paul H. Brookes Publishing Co.

Pitfall #1: Most ECI assessments do not include administration checklists that can be used for integrity/fidelity checks.
REMEDIES:
Use the authentic assessment fidelity measure by Grisham-Brown and colleagues (2008).
Develop a fidelity measure that reflects the procedures of the assessment your program uses.
Grisham-Brown, J., & Pretti-Frontczak, K. (2011). Assessing Young Children in Inclusive Settings: The Blended Practices Approach. Baltimore: Paul H. Brookes Publishing Co.

Pitfall #2: Teams administer assessments without sufficient training or ongoing support.
REMEDIES:
Use a coaching system in which teachers check on one another and provide support for those who are new to using the assessment.
Engage in ongoing professional development to ensure accuracy in scoring and use, and to avoid drift over time.
Grisham-Brown, J., & Pretti-Frontczak, K. (2011). Assessing Young Children in Inclusive Settings: The Blended Practices Approach. Baltimore: Paul H. Brookes Publishing Co.

"Regardless of how information is gathered, what information is gathered, or even how the information is summarized, if it isn't used to plan and guide instruction, then the process is a waste of the teacher's time and provides no advantage for young children." (p. 170)
Grisham-Brown, J., & Pretti-Frontczak, K. (2011). Assessing Young Children in Inclusive Settings: The Blended Practices Approach. Baltimore: Paul H. Brookes Publishing Co.

KITS/TASN TA Packet: Embedding Assessment into Daily Activities and Routines
/ta/Packets/embedAssessment.shtml

AEPS BLOG: Screencast Series

Colorado Department of Education: Results Matter
MVideoSeries_PracticingObservation.htm#top

National Center on Quality Teaching and Learning (NCQTL)
system/teaching

NCQTL
system/teaching/center/practice/ISS

How can we collaborate and share across the state?

Nancy Miller, M.Ed., OTR/L Stephanie Parks, Ph.D., OTR/L