
1 Measurement Purgatory or Best Practice? Alternate Assessment for Students with Significant Cognitive Disabilities
2004 CCSSO Large-scale Conference
Don Peasley, Ohio Department of Education
Tom Deeter, Iowa Department of Education
Rachel Quenemoen, NCEO

2 Overview
• What is required for alternate assessments on alternate achievement standards (AA-AAS) in the context of the 1% Rule? (and last Saturday's presession)
• What is required for AA-AAS in the context of Title I Peer Review?
• Where are we now, and where do we have to go?

3 Alternate Assessments as defined in the "1% Rule"
• Aligned with the State's grade level content standards.
• Yield results separately in reading/language arts and math.
• Designed and implemented to support use of the results to determine AYP.

4 Alternate Assessments should have…
• Clearly defined structure
• Guidelines for which students may participate
• Clearly defined scoring criteria and procedures
• Report format that clearly communicates student performance in terms of the academic achievement standards defined by the State

5 Alternate Assessments
Must meet the same requirements for high technical quality that apply to regular assessments under NCLB:
• Validity
• Reliability
• Accessibility
• Objectivity
• Consistent with nationally recognized professional and technical standards

6 States may use more than one alternate assessment
• Alternate assessment scored against grade-level achievement standards
• Alternate assessment scored against alternate achievement standards
• Both must support access to grade level curriculum

7 Development of Alternate Assessments (Quenemoen, Rigney, & Thurlow, 2002)
1. Careful stakeholder and policymaker development of desired student outcomes for the population, reflecting the best understanding of research and practice, thoughtfully aligned to the same content expected for all students, at grade level.
2. Careful development, testing, and refinement of assessment methods.
3. Scoring of evidence of student work aligned to grade-level content, according to professionally accepted standards, against criteria that reflect the best understanding from research and practice.
4. Standard-setting process to allow use of results in reporting and accountability systems.
5. Continuous improvement of the assessment process.

8 The assessment triangle (Pellegrino et al., 2001): Cognition, Observation, Interpretation [triangle diagram]

9 Professional Understanding of Learning Goals
Shifting goals for students with significant cognitive disabilities since 1975 (Browder, 2001; Kearns & Kleinert, 2004):
• Developmental Goals: "ready meant never"
• 1980s: Functional Goals
• 1990s: Academic Goals and the "general curriculum," which led to developmental traps
• Now we have refocused on GRADE LEVEL Academic Content Standards

10 WHAT IS LEARNING?
We must ensure all students have access to, and make progress in, the academic grade level content, and assess achievement on that content.
What is achievement? What is proficiency?

11 Title I Peer Review Checklist (MSRRC)

12 Draft Technical Manual Outline
Section I: Assessment Development
A. Overview
• Principles guiding development
• Partners and process guiding development
• Research base on desired outcomes for this population, clarification of theory of learning; develop draft performance level descriptors
• Documentation of state conceptualization for (expansion/extension) alignment and access to the state grade level content standards
• Pros and cons of assessment methods considered
• Description of selected approach

13 TASK: Write draft performance level descriptors for AA-AAS
(See the Charlie DePascale, Jeff Nellhaus, Barbara Plake, and Michael Beck session on Monday; nciea.org has basic information on standard-setting.)
• Depth of understanding? Differ in substance? Differ in amount?
• All of the content? Some of the content? Any of the content?
• What does it mean for these students to be proficient in mathematics? In ELA?
• Are we avoiding developmental traps?

14 B. Test Development
• Protocol for alignment to grade level content standards
• Development of draft assessment protocol
• Pilot test design and results
• Field test design and results
C. Test blueprint
• English Language Arts content specifications
• Mathematics content specifications
• Other (e.g., Science) content specifications

15 Section II: Test Administration
A. Procedures for administration
• Decision-making process (participation, IEP team role)
• Local responsibility
• Timelines
B. Training
• Test oversight training for administrators
• Educator training for those working directly with students
• Ethical test administration training

16 Ohio's Alternate Assessment for Students with Disabilities
• The Ohio Alternate Assessment is based on a Collection of Evidence (COE) model.
• It is designed to be a measure of student achievement aligned with Ohio's Academic Content Standards.
• The alternate assessment is a "snapshot" of achievement during a window of time.

17 Collection of Evidence
For each academic area:
• Cover page
• Entry 1 (Standard)
• Entry 2 (Standard)
• Entry 3 (Standard)

18 Ohio's Participation Decision Framework
Does the student have a disability that presents "unique and significant" challenges to participation in district and state assessment, regardless of the accommodations the student could use?
• NO: The student participates in regular district and state assessments, with or without accommodations.
• YES: Does the student have severe motor, sensory, cognitive, or emotional disabilities?
  • NO: The student participates in regular district and state assessments, with or without accommodations.
  • YES: Continue to the questions on the next slide.

19 Ohio's Decision Framework (continued)
Does the student:
• Require substantial modifications to the general education curriculum (form and substance)? AND
• Require instruction focused on the application of state standards through essential life skills? AND
• Require instruction multiple levels below age/grade level? AND
• Appear unlikely to provide a valid and reliable measure of proficiency in content areas via standardized assessment, even with accommodations?
NO (to any of the above): The student participates in regular district and state assessments, with or without accommodations.
YES (to all of the above): The student participates in the Alternate Assessment.
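
Read together, slides 18 and 19 describe a sequence of yes/no checks made by the IEP team. The following is a minimal sketch of that logic only; the StudentProfile fields and the assessment_route() helper are hypothetical and are not part of Ohio's actual instrument.

```python
# Illustrative sketch only: the decision sequence from slides 18-19 expressed as
# boolean logic. Field names and routing strings are assumptions, not Ohio's forms.

from dataclasses import dataclass

@dataclass
class StudentProfile:
    unique_significant_challenges: bool        # slide 18: "unique and significant" challenges despite accommodations
    severe_disability: bool                    # slide 18: severe motor, sensory, cognitive, or emotional disability
    substantial_curriculum_modifications: bool # slide 19, condition 1
    life_skills_application_focus: bool        # slide 19, condition 2
    instruction_multiple_levels_below: bool    # slide 19, condition 3
    standardized_results_unlikely_valid: bool  # slide 19, condition 4

def assessment_route(s: StudentProfile) -> str:
    """Return the assessment the framework points to for this student."""
    if not (s.unique_significant_challenges and s.severe_disability):
        return "regular assessment, with or without accommodations"
    # Slide 19: the four conditions are joined by AND, so every one must hold.
    if all([s.substantial_curriculum_modifications,
            s.life_skills_application_focus,
            s.instruction_multiple_levels_below,
            s.standardized_results_unlikely_valid]):
        return "alternate assessment"
    return "regular assessment, with or without accommodations"

# Example: a student who meets every condition is routed to the alternate assessment.
print(assessment_route(StudentProfile(True, True, True, True, True, True)))
```

In practice each of these answers is an IEP-team judgment rather than a stored value; the sketch only makes the AND/OR structure of the framework explicit.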

20 Section III: Scoring and Reporting
A. Scoring design
• Quality control
• Benchmarking
• Selecting and training scorers
• Scoring activities
• Inter-scorer reliability

21 Ohio's Alternate Assessment for Students with Disabilities: Scoring
The Collection of Evidence is scored across four domains (scoring criteria):
• Performance: holistic, by entry
• Independence/Support: holistic, by entry
• Context/Complexity: holistic, by entry
• Settings and Interactions: for the entire collection
Evidence is scored independently, according to professionally accepted standards, by scoring contractors.
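
Slides 17 and 21 together imply a simple shape for a scored collection, and the "inter-scorer reliability" item in Section III is often summarized as exact agreement between independent scorers. The sketch below assumes a hypothetical score scale, hypothetical class names, and exact agreement as the statistic; none of these details are specified in the slides.

```python
# Illustrative sketch only: one way to represent a scored Collection of Evidence
# and to summarize exact agreement between two independent scorers.
# The score scale, class names, and agreement statistic are assumptions.

from dataclasses import dataclass
from typing import List

@dataclass
class EntryScores:
    performance: int           # holistic score for this entry
    independence_support: int  # holistic score for this entry
    context_complexity: int    # holistic score for this entry

@dataclass
class CollectionScores:
    entries: List[EntryScores]  # e.g., three entries per academic area (slide 17)
    settings_interactions: int  # scored once for the entire collection

def exact_agreement(scorer_a: List[int], scorer_b: List[int]) -> float:
    """Proportion of paired score points on which two independent scorers agree exactly."""
    assert scorer_a and len(scorer_a) == len(scorer_b), "need paired, non-empty score lists"
    matches = sum(a == b for a, b in zip(scorer_a, scorer_b))
    return matches / len(scorer_a)

# Example: two scorers rate the Performance domain for the same ten entries.
print(exact_agreement([3, 2, 4, 1, 3, 2, 2, 4, 3, 1],
                      [3, 2, 3, 1, 3, 2, 2, 4, 3, 2]))  # 0.8
```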


25 B. Standard-setting
• Documented and validated process used for standard setting (full description in Appendix _)
• Performance level descriptors and exemplars for alternate achievement standards
• Distribution of performance across levels
• Comparison of performance across levels achieved in the general assessment by students with disabilities in comparable implementation years
C. Reporting design
• School/District/State Report
• Parent Letter/Individual Student Report
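
One of the standard-setting products listed above is a "distribution of performance across levels." As a minimal sketch of that tabulation, assuming hypothetical cut scores and level names (illustrative only, not any state's adopted standards):

```python
# Illustrative sketch only: tabulating a distribution of performance across levels
# from total scores and cut scores. Cut scores and level names are hypothetical.

from bisect import bisect_right
from collections import Counter
from typing import Dict, List

LEVELS = ["Level 1", "Level 2", "Level 3", "Level 4", "Level 5"]
CUTS = [10, 16, 22, 27]  # hypothetical minimum total scores for Levels 2-5

def performance_level(total_score: int) -> str:
    """Map a total score to a performance level using the cut scores."""
    return LEVELS[bisect_right(CUTS, total_score)]

def level_distribution(scores: List[int]) -> Dict[str, float]:
    """Proportion of students at each performance level."""
    counts = Counter(performance_level(s) for s in scores)
    return {level: counts.get(level, 0) / len(scores) for level in LEVELS}

# Example with ten hypothetical total scores.
print(level_distribution([8, 12, 19, 23, 25, 30, 15, 21, 9, 28]))
```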

26 Ohio Results, Grade 3 Reading Achievement, March 2004 [results chart]

27 Ohio Graduation Tests (Grade 10), Reading, March 2004 [results chart]

28 Ohio Graduation Tests (Grade 10), Mathematics, March 2004 [results chart]

30 Section IV: Reliability and Validity; Other Technical Considerations
A. Summary of studies for reliability, available data
B. Summary of studies for validity, available data
• Face validity studies
• Concurrent validity studies
• Consequential validity studies
C. Other technical considerations

31 Section V: Appendices
• Appendix A: Documentation of development principles, partners, process, research base
• Appendix B: Documentation of training provided, attendance, quality control
• Appendix C: Documentation of scoring protocols, process, quality control
• Appendix D: Formal evaluation data, if available
• Appendix E: Standard-setting report

32 Who are the learners who take alternate assessments?
• How do the type and size of the population vary in terms of learner characteristics, available response repertoires, and complex medical conditions?
• How do variations in who the learners are affect the assessment triangle, and ultimately technical adequacy studies?
• What does the literature say about how students in this population (or these populations) learn?
• How do current theories of learning in the typical population apply to this population?

33 How is technical adequacy defined?
• What is meant by reliability and validity?
• How do traditional definitions of reliability and validity apply to alternate assessments?
• What technical adequacy issues in alternate assessments cannot be resolved with the current knowledge base in large-scale assessment?
• What strategies can be used to resolve these issues?

34 What consequential validity issues (intended and unintended consequences) challenge the foundational assumptions in an alternate assessment?
• What is the relationship between the foundational assumptions of alternate assessments and technical adequacy issues?
• What lessons learned from alternate assessment need to be addressed for the general assessment as well?

35 Next Steps
• Define the learners, and determine how this differs across states
• Build consensus on a theory of learning in the academic content domains for these students
• Step out of our specializations and think together about these challenges

