
Exploring Teacher Choice When Using an Instructionally Embedded Alternate Assessment System
Amy Clark, Meagan Karvonen, Russell Swinburne Romine, & Brooke Nash

Session Overview
- Background on the assessment system and population
- Summary of teacher choice using instructionally embedded assessment during 2016-2017
- Implications and next steps

Assessment Overview

Background
- The DLM consortium administers assessments to students with significant cognitive disabilities
- Five states participate in the integrated model blueprint, which provides summative results based on testing conducted throughout the year in English language arts and mathematics
- Assessment is designed to occur alongside instruction and to inform subsequent instructional decision making

Creation of Instructional Plans
- Teachers create instructional plans using an online system
- They select the content standard and the level at which they want to instruct and assess the student
- Alternate achievement standards are called "Essential Elements" (EEs)
- Assessments are available at five levels, known as linkage levels, for each content standard:
  1. Initial Precursor
  2. Distal Precursor
  3. Proximal Precursor
  4. Target
  5. Successor
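To make the plan structure concrete, here is a minimal sketch of how an instructional plan record might be represented. The class, field names, and EE code are hypothetical illustrations, not the DLM system's actual schema.

```python
from dataclasses import dataclass
from enum import IntEnum

class LinkageLevel(IntEnum):
    """The five linkage levels, ordered from least to most complex."""
    INITIAL_PRECURSOR = 1
    DISTAL_PRECURSOR = 2
    PROXIMAL_PRECURSOR = 3
    TARGET = 4
    SUCCESSOR = 5

@dataclass
class InstructionalPlan:
    """One teacher-created plan: a student, an Essential Element
    (the alternate achievement standard), and the chosen level."""
    student_id: str
    essential_element: str      # e.g., an EE code from the blueprint
    linkage_level: LinkageLevel

# Example: a plan targeting one EE at the Proximal Precursor level
plan = InstructionalPlan("S123", "ELA.EE.RL.3.1",
                         LinkageLevel.PROXIMAL_PRECURSOR)
```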

Assessments at Different Levels
- Here is one content standard, showing how the skill measured varies across the five linkage levels
- The goal of the system is to align instruction and assessment at a level suited to the student's needs

Blueprint
- The flexible design is intended to allow teachers to assess students at a frequency and level that best meets their students' needs, IEP goals, etc.
- Standards are organized within Claims and Conceptual Areas of similar content
- The blueprint specifies which content standards are available and provides selection guidelines for each grade and subject (a sketch of this kind of rule appears below), e.g.:
  - Choose 3 standards within Conceptual Area 1.1
  - Choose standards that make sense for instruction given IEP goals, instructional needs, etc.
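As an illustration of the kind of rule the blueprint encodes, here is a hypothetical coverage check. The conceptual-area label and the minimum of three EEs mirror the example above, but the function and data layout are invented for illustration.

```python
from collections import Counter

def meets_blueprint_rule(selected_ees, ee_to_area, area="C1.1", minimum=3):
    """Check one hypothetical blueprint rule: at least `minimum` EEs
    selected from the given conceptual area."""
    counts = Counter(ee_to_area[ee] for ee in selected_ees)
    return counts[area] >= minimum

# Example with invented EE codes and conceptual-area assignments
ee_to_area = {"EE.A": "C1.1", "EE.B": "C1.1", "EE.C": "C1.1", "EE.D": "C2.1"}
print(meets_blueprint_rule(["EE.A", "EE.B", "EE.C", "EE.D"], ee_to_area))  # True
```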

Issues to Consider for Instructionally Embedded Assessments
- Consider how we define fidelity in the context of an assessment that intentionally allows teacher choice in the depth, breadth, and frequency of assessment
- Examine differences in administration patterns and how they relate to student performance
- Determine the implications for the validity of inferences made from results when there is intended flexibility in the student testing experience
- We now have the first full year of implementation data, so we begin by describing what teachers have chosen to do

Research Questions
1. When are the peak times during which teachers choose to administer more testlets?
2. Do teachers select the linkage level recommended by the system or a different level?
3. Which standards do teachers tend to choose from among those available on the blueprints?
4. To what extent do teachers assess the same student more than once on a standard?

Participation
- 13,334 students with significant cognitive disabilities from 5 states
- 4,241 teachers created instructional plans and administered testlets
- Each instructional plan is measured by a 3-8 item testlet that assesses a single content standard at a single linkage level selected by the teacher
- A total of 201,348 testlets were administered during 2016-2017 instructionally embedded testing
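For scale, these totals imply an average of roughly 201,348 / 13,334 ≈ 15 testlets per participating student across the window; this is a derived average, not a statistic reported on the slide.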

Teacher Choice Within the System

RQ 1: Peak Testing Patterns
- The 2016-2017 instructionally embedded window was open from September through February for teachers to administer assessments covering the full blueprint
- Teachers choose when and how frequently to assess their students within that window

Peak Testing by Week
- Given the flexibility to test when they chose, teachers administered most testlets in the weeks leading up to winter break and in the weeks leading up to the end of the testing window
- Less frequent testing at the beginning of the window and immediately after winter break may indicate greater fidelity, because the intention is for instruction to occur before assessment

Average Number of Testlets Administered to Students per Week
- Across content areas; based on students who tested in that week
- The pattern suggests two types of users (see the sketch below):
  - Those who administer a testlet or two per content area each week, likely as instruction occurs
  - Those who administer all testlets in each content area within a week or two and do not test further (closer to an interim testing model)
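A sketch of the kind of aggregation behind this figure, assuming a testlet-level log with hypothetical column names (student_id, completed_date); the real DLM data layout may differ.

```python
import pandas as pd

# One row per administered testlet (hypothetical file and columns)
testlets = pd.read_csv("testlet_log.csv", parse_dates=["completed_date"])

# Bucket administrations into ISO weeks
testlets["week"] = testlets["completed_date"].dt.isocalendar().week

# Average testlets per student per week, counting only students
# who tested in that week (as on the slide above)
per_student_week = (testlets
                    .groupby(["week", "student_id"])
                    .size()
                    .rename("n_testlets")
                    .reset_index())
avg_by_week = per_student_week.groupby("week")["n_testlets"].mean()
print(avg_by_week)
```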

RQ 2: System-Recommended Linkage Level
- Prior to testing, teachers complete a survey of learner characteristics for each student
- Responses to items about ELA, math, and expressive communication produce a complexity band for each content area
- There are four complexity bands: Foundational, Band 1, Band 2, and Band 3

Correspondence of Complexity Bands to System-Recommended Linkage Level

  Complexity Band   Recommended Linkage Level
  Foundational      Initial Precursor
  Band 1            Distal Precursor
  Band 2            Proximal Precursor
  Band 3            Target

The Successor level is never recommended by the system; a teacher can choose to assign it.
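The mapping above, and the "Change" metric used in the next two tables, can be expressed directly. A minimal sketch, assuming the integer level ordering shown earlier (Initial Precursor = 1 through Successor = 5); the function name is invented for illustration.

```python
# System-recommended linkage level for each complexity band
RECOMMENDED = {
    "Foundational": 1,  # Initial Precursor
    "Band 1": 2,        # Distal Precursor
    "Band 2": 3,        # Proximal Precursor
    "Band 3": 4,        # Target (Successor, level 5, is never recommended)
}

def level_adjustment(band: str, chosen_level: int) -> int:
    """Signed difference between the level a teacher chose and the
    system recommendation; 0 means the recommendation was kept."""
    return chosen_level - RECOMMENDED[band]

# Example: a Band 2 student assigned a Distal Precursor testlet -> -1
assert level_adjustment("Band 2", 2) == -1
```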

ELA Adjustment from System-Recommended Level

  Change   Foundational    Band 1          Band 2          Band 3
  -3       N/A             N/A             N/A             347 (3.0)
  -2       N/A             N/A             2,528 (6.6)     1,014 (8.6)
  -1       N/A             7,437 (20.9)    6,429 (16.7)    1,867 (15.9)
   0       13,352 (88.8)   25,363 (71.4)   27,389 (71.3)   8,190 (69.8)
  +1       965 (6.4)       2,049 (5.8)     1,646 (4.3)     315 (2.7)
  +2       487 (3.2)       463 (1.3)       426 (1.1)       N/A
  +3       140 (0.9)       215 (0.6)       N/A             N/A
  +4       85              N/A             N/A             N/A

Cells show n (%); n = instructionally embedded instructional plans. Because the Successor level is never recommended, a change of -4 levels from the recommended level is not possible. In most instances (approximately 75%), teachers did not adjust the level from the recommendation (the Change = 0 row). When teachers did adjust the linkage level, they most often chose a testlet one linkage level below the recommended level.

Math Adjustment from System-Recommended Level

  Change   Foundational    Band 1          Band 2          Band 3
  -3       N/A             N/A             N/A             162 (2.1)
  -2       N/A             N/A             2,420 (6.1)     598 (7.8)
  -1       N/A             8,435 (22.4)    6,243 (15.8)    952 (12.3)
   0       14,821 (94.1)   27,280 (72.6)   28,541 (72.1)   5,788 (75.0)
  +1       640 (4.1)       1,337 (3.6)     2,104 (5.3)     216 (2.8)
  +2       161 (1.0)       450 (1.2)       261 (0.7)       N/A
  +3       95 (0.6)        91 (0.2)        N/A             N/A
  +4       33              N/A             N/A             N/A

Cells show n (%); n = instructionally embedded instructional plans. Math results are similar to ELA; teachers adjusted even less often.

Testlets Administered at Each Linkage Level

  Linkage Level        n        %
  Initial Precursor    49,502   24.6
  Distal Precursor     68,533   34.0
  Proximal Precursor   62,795   31.2
  Target               18,876    9.4
  Successor             1,642    0.8

Teachers overwhelmingly chose to administer testlets at the lowest three linkage levels; only about 10% of testlets (9.4% + 0.8%) were administered at the Target or Successor level, regardless of the system recommendation.

RQ 3: Most-Selected Standards
- The blueprint incorporates teacher flexibility so that instruction and assessment occur in the areas most relevant to the student's instructional plan and IEP goals
- Blueprint requirements allow teacher choice, e.g., choose 3 EEs within Conceptual Area 1.1
- We are interested in which EEs teachers actually choose
- Choices have implications for students' opportunity to learn

Grade 3 ELA Example
- Writing EEs in Conceptual Area 2.1 are required (and accordingly were among the most selected)
- More EEs are available in Conceptual Area 1.1; teachers tended to favor RL.3.1 and to avoid RI.3.5 and RL.3.2
- These findings may indicate decreased opportunity to learn the less frequently chosen EEs, or point to an area where additional professional development could help teachers feel comfortable instructing and assessing those content standards

RQ 4: Testing the Same Standard Multiple Times
- As instruction occurs, teachers can choose to create additional instructional plans to re-assess a content standard
- Re-assessment can be at the same linkage level or at a different linkage level
- This gets at the depth of instruction (versus breadth)

- Of students who tested on a particular EE more than once, 90% tested on it twice
- This graph shows the number of EEs that students tested on multiple times; again, most students tested on two EEs multiple times
- These counts contribute to the overall picture of how much students used the assessment system and how much information was available to teachers to inform instruction

Testing on Multiple Linkage Levels Within a Standard
- 2,604 students (19.5%) tested on more than one linkage level within a standard
- Of students who assessed the same standard at more than one linkage level, most assessed at two different levels (mean = 2.1, median = 2)
- However, in 23 instances across all students and standards (0.01%), a student tested on all five linkage levels within a standard
- This may indicate a misunderstanding of how the system is intended to be used (e.g., not knowing that the scoring model makes inferences across the levels tested, and therefore thinking students must test at every level to receive "credit"); similarly, teachers may not want to frustrate Initial Precursor-level students with content that will be too challenging
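One way such repeat-testing counts could be derived from a testlet log; a sketch with hypothetical column names, not the authors' actual analysis code.

```python
import pandas as pd

testlets = pd.read_csv("testlet_log.csv")  # hypothetical layout

# Number of distinct linkage levels each student tested within each EE
levels_per_ee = (testlets
                 .groupby(["student_id", "essential_element"])["linkage_level"]
                 .nunique())

# Students who tested more than one level within at least one standard
multi_level = levels_per_ee[levels_per_ee > 1]
print(multi_level.groupby(level="student_id").size().describe())

# Share of (student, EE, level) combinations where the same level was
# administered more than once (cf. the 2.5% figure on the next slide)
same_level_repeats = (testlets
                      .groupby(["student_id", "essential_element",
                                "linkage_level"])
                      .size())
print((same_level_repeats > 1).mean())
```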

Frequency of a Level Being Assessed More Than Once, Across All Students and Standards
- 2.5% of the time, a student tested on the same linkage level for a standard more than once

  Linkage Level        n       %
  Initial Precursor    1,182   23.5
  Distal Precursor     1,641   32.6
  Proximal Precursor   1,569   31.2
  Target                 633   12.6
  Successor                7    0.1

In only 7 instances across all students and EEs was the Successor level tested multiple times for an EE (i.e., more than one Successor testlet administered for the EE).

Discussion

Summary of Results
- Overall patterns of use show that students received at least appropriate content coverage
- Teachers generally do not override system recommendations
- The system appears to assign testlets at a level that lets students access the content
- Some practice may still reflect using the system to meet requirements rather than to inform instruction; AA-AAS has historically been seen as fulfilling a legislative mandate rather than providing feedback on student performance (Nitsch, 2013)

Implications for Fidelity
- Expectation of some minimum threshold of use (e.g., full blueprint coverage)
- To fulfill the goal of informing instruction, a range of actions is possible:
  - Retesting on a standard, if instruction occurred in the time between tests
  - Administering fewer testlets across more weeks vs. testing in shorter, focused time blocks (which may also be guided by state policies)
- What actions fall outside the likely bounds of useful assessment? E.g., testing on all standards and levels in a short time period (a hypothetical flag for this pattern is sketched below)
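A hypothetical flag for the out-of-bounds pattern named above, i.e., all of a student's testlets compressed into a short window. The threshold is invented for illustration, not DLM policy.

```python
from datetime import timedelta

def flags_compressed_testing(dates, min_span_days=14):
    """Flag a student whose testlet administrations all fall within a
    short span, suggesting interim-style use rather than assessment
    alongside instruction. `dates` is a list of administration dates."""
    if len(dates) < 2:
        return False
    span = max(dates) - min(dates)
    return span < timedelta(days=min_span_days)
```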

Next Steps
- After spring 2017 data are collected: is there a relationship between use of the instructionally embedded assessment system and students' summative assessment results?
- Teacher survey data collection is currently underway to gather feedback on choices made during instructionally embedded testing and on how progress reports were used to inform instruction
- Defining a measure of implementation fidelity
- Examining within-student and within-teacher patterns of testlet administration

THANK YOU!
For more information, please visit dynamiclearningmaps.org or contact akclark@ku.edu