1
MCAS-Alt: Alternate Assessment in Massachusetts
Technical Challenges and Approaches to Validity
Daniel J. Wiener, Administrator of Inclusive Assessment
University of Maryland – Alternate Assessment Conference, October 11-12, 2007
2
Participation: Thinking Differently About Who Needs an Alternate Assessment
MCAS-Alt is intended for:
– Students with significant cognitive disabilities, AND
– Students who focus on attaining grade-level achievement standards, but who cannot fully demonstrate knowledge and skills on the test, even with accommodations
The state has aligned instruction from the lowest level of complexity to grade-level expectations
Implications for scoring and reporting results:
– Alternate achievement standards
– Grade-level achievement standards
3
Reporting Results
Meaningful performance levels are reported for MCAS-Alt, while acknowledging that performance is below grade-level expectations
A student can attain real proficiency through the alternate assessment based on grade-level achievement standards
Performance Levels:
– MCAS Test: Advanced, Proficient, Needs Improvement, Warning (Failing at Grade 10)
– MCAS-Alt: Advanced, Proficient, Needs Improvement, Progressing, Emerging, Awareness
4
MCAS-Alt: A “structured portfolio”
Work samples/video/photo evidence (performance) and data charts (progress) are compiled in an annual portfolio
Evidence shows the complexity of tasks, and the student’s accuracy and independence in performing tasks aligned with required subjects/strands/standards
[Data chart: % Accuracy and % Independence plotted on a 0–100% scale across dates 12/1/06–12/5/06]
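As a rough illustration of how the percentages on such a data chart might be derived, here is a minimal Python sketch that computes daily % Accuracy and % Independence from trial-level records. The data layout, field names, and sample values are assumptions for illustration, not part of the MCAS-Alt materials.

```python
# Minimal sketch (not an official MCAS-Alt tool): compute the daily
# percentages a teacher might plot on a data chart from trial records.
from collections import defaultdict

# Each trial: (date, response_was_correct, response_was_unprompted)
trials = [
    ("12/1/06", True,  False),
    ("12/1/06", False, False),
    ("12/1/06", True,  True),
    ("12/2/06", True,  True),
    ("12/2/06", True,  False),
]

by_date = defaultdict(list)
for date, correct, unprompted in trials:
    by_date[date].append((correct, unprompted))

for date, rows in sorted(by_date.items()):
    n = len(rows)
    accuracy = 100 * sum(c for c, _ in rows) / n        # % Accuracy
    independence = 100 * sum(u for _, u in rows) / n    # % Independence
    print(f"{date}: {accuracy:.0f}% accurate, {independence:.0f}% independent")
```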
5
Sometimes, It Seems Like This….
6
…It Could Be More Like This…
[Diagram labels: Learning Standards, Entry Points]
8
Scoring Criteria
Used to calculate the Performance Level:
– Completeness of portfolio
– Level of Complexity (difficulty of standards)
– Demonstration of Skills and Concepts (accuracy)
– Independence (cues/prompts/assistance)
Plus:
– Self-Evaluation (monitor, self-correct, reflect)
– Generalization (varied instructional approaches)
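To make the criteria concrete, here is a minimal sketch of a record that could hold these scoring criteria for one portfolio strand. The class name, field names, and score ranges are illustrative assumptions, not an official MCAS-Alt data format.

```python
# Illustrative record of the scoring criteria for a single portfolio strand.
from dataclasses import dataclass

@dataclass
class StrandScore:
    complete: bool            # Completeness: required evidence is present
    level_of_complexity: int  # difficulty of the standards addressed
    skills_and_concepts: int  # accuracy score, 1-4 ("M" handled separately)
    independence: int         # cues/prompts/assistance score, 1-4
    self_evaluation: bool     # evidence of monitoring, self-correcting, reflecting
    generalization: bool      # evidence of varied instructional approaches
```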
9
MCAS-Alt Scoring Rubric: Demonstration of Skills and Concepts
How accurate were the student’s responses?
– M: The portfolio strand contains insufficient information to determine a score.
– 1: Student’s performance is primarily inaccurate in this strand. (0–25% accurate)
– 2: Student’s performance is limited and inconsistent with regard to accuracy in this strand. (26–50% accurate)
– 3: Student’s performance is mostly accurate in this strand. (51–75% accurate)
– 4: Student’s performance is accurate and of consistently high quality in this strand. (76–100% accurate)
10
MCAS-Alt Scoring Rubric: Independence
To what degree were prompts used? How independent were the student’s responses?
– M: The portfolio strand contains insufficient information to determine a score.
– 1: Student requires extensive verbal, visual, and physical assistance to demonstrate skills and concepts in this strand. (0–25% independent)
– 2: Student requires frequent verbal, visual, and physical assistance to demonstrate skills and concepts in this strand. (26–50% independent)
– 3: Student requires some verbal, visual, and physical assistance to demonstrate skills and concepts in this strand. (51–75% independent)
– 4: Student requires minimal verbal, visual, and physical assistance to demonstrate skills and concepts in this strand. (76–100% independent)
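The two rubrics above share the same percentage bands. Here is a minimal sketch of that band-to-score mapping; the function name and the handling of the “M” case are illustrative assumptions, not the actual MCAS-Alt scoring code.

```python
# Map a percentage (0-100) to the 1-4 rubric bands shown above, or "M"
# when the strand contains insufficient information to score.
def rubric_score(percent, has_sufficient_evidence=True):
    if not has_sufficient_evidence:
        return "M"
    if percent <= 25:
        return 1   # 0-25%
    if percent <= 50:
        return 2   # 26-50%
    if percent <= 75:
        return 3   # 51-75%
    return 4       # 76-100%

print(rubric_score(82))  # e.g., 82% accurate     -> 4
print(rubric_score(40))  # e.g., 40% independent  -> 2
```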
11
Setting Performance Levels
Use score combinations to describe characteristics of the student’s performance: Reasoned Judgment
– Example: LC=3, DSC=4, Ind=3 shows the student’s performance is primarily accurate and independent, although below expectations for grade level.
– Example: LC=3, DSC=2, Ind=2 shows the student’s performance is limited/inconsistent and the student requires frequent prompting/assistance.
12
Score Combination Tables
[Tables for Level of Complexity = 2 and Level of Complexity = 3: rows are Independence scores 1–4 (0–25%, 26–50%, 51–75%, 76–100%), columns are Demonstration of Skills scores 1–4, and each cell gives the resulting performance level (Aw = Awareness, Em = Emerging, Pg = Progressing)]
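A hedged sketch of the lookup mechanism these tables imply: each (Level of Complexity, Demonstration of Skills, Independence) combination maps to a performance level. The specific cell values below are illustrative placeholders only, since the full tables are not reproduced here.

```python
# Illustrative score-combination lookup; the entries are placeholders,
# not the official MCAS-Alt cut decisions.
PERFORMANCE_LEVELS = {"Aw": "Awareness", "Em": "Emerging", "Pg": "Progressing"}

# (level_of_complexity, demo_of_skills, independence) -> performance-level code
SCORE_COMBINATIONS = {
    (2, 1, 1): "Aw",   # low accuracy, low independence
    (2, 2, 2): "Em",   # illustrative placeholder
    (2, 4, 4): "Pg",   # high accuracy and independence at LC=2
}

def performance_level(lc, dsc, ind):
    code = SCORE_COMBINATIONS.get((lc, dsc, ind))
    return PERFORMANCE_LEVELS.get(code, "not covered in this sketch")

print(performance_level(2, 4, 4))  # -> Progressing
```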
13
Score Combination Tables (continued)
[Tables for Level of Complexity = 4 and Level of Complexity = 5, with the same Demonstration of Skills and Independence axes]
14
Technical Validity and Reliability: Some Tricky Areas for MCAS-Alt
“Test item inter-relationship”
– But tasks are selected and/or designed by teachers, and
– There is little standardization across portfolios
“Assessment reflects full range of content standards”
– But non-regulatory guidance says these students won’t necessarily access all the standards, and
– Portfolios cannot cover all the standards, only those that were taught
– Validate that targeted skills shown in the evidence are based on grade-level content standards
– Is an external alignment study necessary?
“Reliability of scores” when responses are so diverse
One purpose of MCAS-Alt: instructional improvement
– How to document that this occurred?
15
“Did the MCAS-Alt Meet Its Intended Purposes?”
Tell our story: Did the assessment do what we said it would do? If not, how did we fix it?
This criterion allowed us to document:
– Whether the student was provided access to curriculum
– Whether new, challenging skills were taught
– How well the student learned new skills, concepts, and content
– Whether teaching and learning improved as a result of MCAS-Alt
16
Document What Happened: Validating the Development Process
We tried to get the right people at the table
We carefully documented all decisions:
– Determining the purpose(s) of the alternate assessment
– What we want to measure (scoring rubric)
– Describing the student’s performance (descriptors)
– Calculating a score (scoring rules)
– Translating scores into performance levels (standard setting)
– Where one performance level ends and another begins (cut scores)
– Aligning content and validating the alignment
– Continuous improvements to the system
17
Who Contributed to the Validation Process?
– Curriculum Framework writers served on panels to develop the Resource Guide to the Frameworks for Students with Disabilities
– Content specialists defined the “essence” of standards and “entry points” at various levels of complexity; special educators pushed them to go lower
– Diverse stakeholders shared their perspectives
– Technical advisors helped set performance standards, using reasoned judgment of each “score combination”
– Contractors told us what others had tried, and what might work
– Scorers linked the portfolio evidence to the required standard using the Resource Guide, with 94% IRC
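One common way to compute an inter-rater consistency (IRC) figure like the 94% cited above is percent exact agreement between two independent scorers. The sketch below assumes that definition and uses made-up scores, so it is not the actual MCAS-Alt calculation.

```python
# Percent exact agreement between two scorers on the same set of portfolios.
def percent_agreement(scores_a, scores_b):
    assert len(scores_a) == len(scores_b)
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return 100 * matches / len(scores_a)

# Hypothetical scores from two independent scorers on eight portfolio strands.
scorer_1 = [4, 3, 3, 2, 4, 1, 3, 2]
scorer_2 = [4, 3, 2, 2, 4, 1, 3, 2]
print(f"{percent_agreement(scorer_1, scorer_2):.0f}% agreement")  # -> 88% agreement
```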
18
Resources
MA Department of Education (781-338-3625)
Dan Wiener – dwiener@doe.mass.edu
MCAS-Alt Website: www.doe.mass.edu/mcas/alt
19
MCAS-Alt: The Evolution of a Validity Argument
Charles A. DePascale, National Center for the Improvement of Educational Assessment
University of Maryland – Alternate Assessment Conference, October 11-12, 2007
20
The Evolution of a Validity Argument
– Defining the purposes of the assessment
– Identifying the multiple uses of the assessment and the populations of students
– Specifying the inferences that would be supported by the assessment
– Determining that one “set of rules” and procedures would not be sufficient
21
The Evolution of a Validity Argument
Designing the system
– Building checks and balances into the system
Documentation:
– Understanding the extent to which documentation is the system
– Understanding the importance of documentation of the system
22
The Evolution of a Validity Argument
Flexibility and Standardization (Gong & Marion, 2006)
– Making decisions about where to be flexible and where it is necessary to standardize
Making adjustments to enhance validity
– Adopting a continual improvement approach
– Determining when and how to make changes to improve the system