New Hampshire Enhanced Assessment Initiative: Technical Documentation for Alternate Assessments Standard Setting Inclusive Assessment Seminar Marianne Perie

The Assessment Triangle and Validity Evaluation (Marion, Quenemoen, & Kearns, 2006)
– Cognition: Student Population, Academic Content, Theory of Learning
– Observation: Assessment System, Test Development, Administration, Scoring, Reporting
– Interpretation: Alignment, Item Analysis/DIF/Bias, Measurement Error, Scaling and Equating, Standard Setting
– Validity Evaluation: Empirical Evidence, Theory and Logic (argument), Consequential Features

3 Regulation on alternate achievement standards §200.1 “For students… with the most significant cognitive disabilities, who take an alternate assessment, a State may, through a documented and validated standards-setting process, define alternate achievement standards…”

4 Components of an Achievement Standard
– Label
– Descriptor
– Student work
– Cut score(s)

5 Alternate vs. General Assessment
Much is the same:
– Write distinct, detailed performance level descriptors (PLDs) that link directly to the content standards
– Use a documented and validated standard-setting methodology
– Convene a panel of educators and stakeholders who are familiar with the population of students tested and represent the diversity of the state
– Train them in the content standards, PLDs, and methodology
– Collect individual judgments and use group discussion across rounds
– Provide normative feedback to the committee to show the relationship between the cut scores and results
– Aggregate judgments to determine final panel-recommended cut scores (see the sketch after this list)
– Convene appropriate policymaker(s), apprise them of the recommendations, and have them legally adopt the cut scores
– Document the PLDs, rationale and procedures for the methodology, selection of panelists, training provided, ratings, and variance measures
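The aggregation and variance-documentation steps above can be made concrete with a short computation. The following is a minimal Python sketch, not drawn from the NHEAI materials, that takes hypothetical final-round cut-score judgments from a panel, reports the median as the panel-recommended cut, and computes the standard error of judgment (SEJ) that would be documented alongside it. All panelist values are invented.

```python
# Minimal sketch (illustrative only): aggregate hypothetical final-round
# panelist cut-score judgments and estimate the standard error of judgment.
import math
import statistics

final_round_judgments = [12.0, 13.5, 12.5, 14.0, 13.0, 12.5, 13.5]  # one value per panelist

panel_cut = statistics.median(final_round_judgments)   # a common aggregation choice
mean_cut = statistics.mean(final_round_judgments)
sd = statistics.stdev(final_round_judgments)            # variability across panelists
sej = sd / math.sqrt(len(final_round_judgments))        # standard error of judgment

print(f"Recommended cut (median): {panel_cut:.1f}")
print(f"Mean cut: {mean_cut:.2f}, SD: {sd:.2f}, SEJ: {sej:.2f}")
```

Taking the median rather than the mean reduces the influence of a single outlying panelist; whichever rule is used should be specified in advance and documented with the ratings.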

6 Alternate vs. General Assessment
Only a few details differ:
– PLDs can be written to a grade span rather than a grade level if adjoining grades are sufficiently similar.
– PLDs range from entry points to exit points but must represent the highest standard possible for this population.
– Panels are more likely to include parents and other stakeholders involved with this special population.

7 Key Considerations
1. Writing clear, descriptive performance level descriptors aligned with the state content standards: The task of writing PLDs is similar to that for large-scale assessments. They must be written by people who understand the population and what can be achieved.
2. Matching the judgmental task of the standard-setting method to the test type: Methods can focus on items, performance tasks, whole bodies of work, or students. Choose the methodology, or an adaptation, that best fits the assessment.
3. Validating that the cut score best reflects the intention of the performance level descriptor: It is important to conduct follow-up studies to ensure the “right” students are being placed in each performance level (a minimal agreement check is sketched below).
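As one hypothetical illustration of such a follow-up study, the Python sketch below cross-tabulates the performance level assigned by the assessment against an independent teacher judgment and reports the exact agreement rate; the level labels and data pairs are invented for illustration.

```python
# Minimal sketch (invented data): compare assessment-assigned levels with
# independent teacher judgments and report exact agreement.
from collections import Counter

pairs = [("Proficient", "Proficient"), ("Basic", "Basic"), ("Proficient", "Basic"),
         ("Advanced", "Advanced"), ("Basic", "Proficient"), ("Proficient", "Proficient"),
         ("Basic", "Basic"), ("Advanced", "Proficient")]

crosstab = Counter(pairs)                       # (assessment level, teacher judgment) counts
exact = sum(n for (a, t), n in crosstab.items() if a == t)

print(f"Exact agreement: {100 * exact / len(pairs):.1f}%")
for (a, t), n in sorted(crosstab.items()):
    print(f"assessment={a:<10} teacher={t:<10} n={n}")
```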

8 PLDs
Requirements:
– Represent the highest standard possible for this population
– Capture essential skills
– Align with state content standards
– Clearly differentiate among levels
– Progress logically across levels (e.g., is Proficient appropriately higher than Basic?)
– Progress logically across grade levels (e.g., is grade 5 Proficient sufficiently more advanced than grade 3 Proficient?)
– Represent knowledge and skills that can be evaluated by the assessment (e.g., don’t discuss independence in the PLD if your assessment doesn’t measure independence)
Options:
– Write PLDs for a grade span rather than for a grade level

9 Basic Steps for Writing PLDs
1. Convene a panel of stakeholders for each subject and grade span
2. Provide information about the students and the assessment
3. Share relevant literature about what students with disabilities can learn and do
4. Share sample student work
5. Discuss requirements for PLDs
6. Provide sample PLDs
7. Ask the panel to draft PLDs as a group; work for consensus
8. Compare PLDs across panels (subject/grade span) and revise as needed
9. Final revisions and adoption by policymakers

10 Choosing a Method
Match the judgmental task of the method to the types of data gathered through your assessment.
– E.g., Angoff or Bookmark works better with assessments made up of multiple items, while Body of Work or a profile approach works better with a portfolio or holistic assessment (see the sketch below).
See the handout summarizing methods and corresponding assessment types.
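To show how the judgmental task differs across methods, here is a minimal Python sketch of a modified Angoff style computation using hypothetical ratings: each panelist estimates, item by item, the probability that a just-proficient student would respond correctly, and the recommended cut is the average of the panelists' summed expectations. Portfolio-based methods such as Body of Work instead ask panelists to classify whole bodies of student work, so no item-level arithmetic of this kind applies.

```python
# Minimal sketch (hypothetical ratings): a modified Angoff computation.
# Each row is one panelist; each value is the rated probability that a
# just-proficient student answers that item correctly.
panelist_item_ratings = [
    [0.6, 0.8, 0.4, 0.7, 0.9],   # panelist 1
    [0.5, 0.7, 0.5, 0.8, 0.8],   # panelist 2
    [0.7, 0.9, 0.3, 0.6, 0.9],   # panelist 3
]

panelist_cuts = [sum(ratings) for ratings in panelist_item_ratings]  # expected raw score per panelist
angoff_cut = sum(panelist_cuts) / len(panelist_cuts)

print(f"Panelist expected scores: {panelist_cuts}")
print(f"Angoff cut score (raw-score metric): {angoff_cut:.2f}")
```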

11 Adopting a Cut Score
Consider smoothing your results across grades or across subjects so that your impact data make sense.
Cut scores should be adopted by a policy body (e.g., the state board) that is provided with relevant information:
– Panel-recommended cut scores
– Smoothed cut scores
– Variance measures (SEJ/SEM)
– Impact data, which could be broken out by disability (see the sketch below)
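Impact data are simply the percentages of students who would fall into each performance level under a given set of cut scores. The Python sketch below illustrates the computation with invented scores, level names, and cut points, treating each cut as the minimum score for its level.

```python
# Minimal sketch (invented data): compute impact data from scores and cut points.
from bisect import bisect_right

scores = [4, 7, 9, 11, 12, 13, 15, 16, 18, 20, 22, 25]   # hypothetical student scores
cuts = [8, 13, 19]                                        # minimum scores for the top three levels
levels = ["Below Basic", "Basic", "Proficient", "Advanced"]

counts = [0] * len(levels)
for score in scores:
    counts[bisect_right(cuts, score)] += 1   # index of the level the score falls into

for level, count in zip(levels, counts):
    print(f"{level}: {100 * count / len(scores):.1f}%")
```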

12 Validating the Performance Standards
Cut scores may be considered provisional until they have been used and validated.
Validation strategies:
– Teacher judgments on students (contrasting groups; see the sketch below)
– Longitudinal data on students’ progression across grade levels (Stay in a performance level? Move up or down? How much movement?)
– External measures (other evaluations, grades)
Monitor effects of using performance levels over time.
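One simple way to carry out the contrasting-groups strategy, sketched below with invented data, is to compare the adopted cut score with the midpoint between the median scores of students whom teachers judge proficient and not proficient; a large discrepancy would flag the cut for review. More formal versions model the probability of a proficient judgment as a function of score.

```python
# Minimal sketch (invented data): a simple contrasting-groups check on a cut score.
import statistics

# (score, teacher_says_proficient) pairs, gathered independently of the assessment result
data = [(5, False), (7, False), (9, False), (10, False), (11, True),
        (12, False), (13, True), (14, True), (16, True), (18, True)]

not_prof = [s for s, prof in data if not prof]
prof = [s for s, prof in data if prof]

contrasting_cut = (statistics.median(not_prof) + statistics.median(prof)) / 2
adopted_cut = 13  # hypothetical panel-recommended cut

print(f"Contrasting-groups estimate: {contrasting_cut:.1f}  vs  adopted cut: {adopted_cut}")
```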

13 Checklist for Performance Standards
– Understandable and useful for stakeholders
– Clearly differentiate among levels
– Grounded in student work but not tied to the status quo
– Built by consensus
– Focused on learning
(CCSSO Handbook, p. 16)