1 Delivery of Test Accessibility Tools and Accommodations Across 3 Platforms: Implementation, Success, and Lessons Learned
Trinell Bowman, Maryland; Melissa Gholson, West Virginia; Jen Paul, Michigan; Laurene Christensen, NCEO; Cara Laitusis, ETS

2 The Charlie’s Angels of Accessibility and Accommodations
Trinell Bowman Melissa Gholson Jen Paul

3 Overview of the presentation
Three big questions:
How were accessibility features and accommodations implemented?
Were there performance differences observed?
What were your state's lessons learned?
Discussion: Cara Laitusis, ETS; Amy Hewitt

4 How were accessibility tools and accommodations (for English learners, students with disabilities, and general education students) implemented on each platform?

5 Accessibility System for All Students
Accessibility features for all students (embedded): Audio Amplification; Bookmark*; Eliminate Answer Choices; Highlight Tool; Line Reader Tool; Magnifier; Note Pad; Pop-up Glossary; Spell Check*; Writing Tools
Accessibility features for all students (non-embedded): Blank Scratch Paper; General Administration Directions Read Aloud, Repeated, or Clarified; Headphones or Noise Buffers; Enlargement Device; Redirect Student to the Test; Spell Check Device; Student Reads Assessment Aloud to Him- or Herself
Accessibility features identified in advance: Answer Masking; Color Contrast (Background/Font Color); Text-to-Speech for Mathematics; Human Reader/Human Signer for Mathematics; General Line Reader Masking
Accommodations for English learners: Online Translated Assessment (Mathematics); Text-to-Speech in Spanish (Mathematics); Extended Time; Word-to-Word Dictionary; Speech-to-Text Device; Human Scribe; Test Directions Read Aloud in Native Language; Paper Math Translation; Large Print Math Spanish Assessment; Human Reader for Spanish Mathematics Assessments
Accommodations for students with disabilities: Screen Reader; Closed Captioning; Text-to-Speech ELA; ASL Video; Assistive Technology; Refreshable Braille; Braille; Large Print; Paper-Based Assessment; Human Reader/Signer ELA; Student Reads Assessment Aloud; Braille Note Taker; Braille Writer; Calculation Device; Mathematics Tools; External Speech-to-Text; Scribe; Word Prediction Device; Extended Time

6 Accessibility Features and Accommodations
For Accessibility Features Identified in Advance (Tier 2) and Accommodations (Tier 3), PARCC implemented a Personal Needs Profile (PNP). The PNP is a collection of student information regarding the testing conditions, materials, or accessibility features and accommodations a student needs to take a PARCC assessment. During Year 2 of the PARCC assessment, the student registration file and the PNP were combined into one file layout, now called the Student Registration/Personal Needs Profile (SR/PNP). The testing platform used is Pearson's TestNav system.

7 SR/PNP Collects
Embedded accessibility features, identified in advance, that need to be enabled (via SR/PNP File Layout/User Interface). Ex: Text-to-Speech for Mathematics or Color Contrast
Embedded accommodations that need to be enabled (via PNP File Layout/User Interface). Ex: Screen Reader, Closed Captioning, Spanish Text-to-Speech
Externally provided administrative considerations, accessibility features identified in advance, and accommodations (via PNP File Layout/User Interface). Ex: Frequent Breaks, Speech-to-Text, Calculator
Paper-based accommodated forms that require advance shipping (via Student Registration File). Ex: Braille, Large Print, or Spanish Test Book
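To make this concrete, here is a minimal sketch of what a single parsed SR/PNP record and the feature-enabling step might look like. The field names, types, and helper function are illustrative assumptions only; they do not follow the actual SR/PNP Field Definitions Guide.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SrPnpRecord:
    """Hypothetical, simplified view of one student's SR/PNP entry."""
    state_student_id: str
    # Embedded accessibility features identified in advance (Tier 2)
    text_to_speech_math: bool = False
    color_contrast: str = ""          # e.g., "black-on-cream"
    answer_masking: bool = False
    # Embedded accommodations (Tier 3)
    screen_reader: bool = False
    closed_captioning: bool = False
    spanish_text_to_speech: bool = False
    # Externally provided supports, e.g., ["frequent breaks"]
    external_supports: List[str] = field(default_factory=list)
    # Paper-based accommodated forms that must be shipped in advance
    paper_form: str = ""              # e.g., "braille", "large print", "spanish"

def embedded_features_to_enable(record: SrPnpRecord) -> List[str]:
    """Return the embedded tools the platform should switch on for this student."""
    flags = {
        "text-to-speech (mathematics)": record.text_to_speech_math,
        "answer masking": record.answer_masking,
        "screen reader": record.screen_reader,
        "closed captioning": record.closed_captioning,
        "Spanish text-to-speech": record.spanish_text_to_speech,
    }
    enabled = [name for name, on in flags.items() if on]
    if record.color_contrast:
        enabled.append(f"color contrast: {record.color_contrast}")
    return enabled

# Example: a student who needs mathematics TTS and answer masking.
student = SrPnpRecord("MD0001234", text_to_speech_math=True, answer_masking=True)
print(embedded_features_to_enable(student))
```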

8 Who Collects Information for Student PNPs
Students with disabilities: IEP team or 504 plan coordinator
English learners: Educator(s) responsible for selecting accommodations
English learners with disabilities: IEP team, including educator(s) familiar with the language needs of the student, or 504 plan coordinator
Students without disabilities who are not English learners: the team may include the student (as appropriate), parent/guardian, and the student's primary educator in the subject of the assessment
For students who have the calculator and mathematics tools: arithmetic tables (e.g., addition, subtraction, multiplication, and division charts); two-color chips (single-sided or double-sided); counters and counting chips; square tiles; base 10 blocks; 100s chart

9 ELA/Literacy Practice Test Accommodations
ELA/Literacy Practice Test Accommodations
Mathematics Practice Test Accommodations

10 Examples of Embedded Accommodations
Text-to-Speech
American Sign Language Video

11 Mathematics Assessments in Spanish
Online Spanish
Paper Spanish
Spanish Text-to-Speech
Large Print Spanish

12 Michigan Department of Education
Jennifer M. Paul, EL & Accessibility Assessment Specialist

13 Supports/Accommodations Implementation
Support                               Content Area and Grades                  Category
Text-to-speech – items only           ELA: Grades 3-5; Math: Grades 3-8, 11    Designated Support
Text-to-speech – items and passages   ELA: Grades 6-8, 11                      Accommodation
Stacked Spanish                       Math: Grades 3-8, 11                     Designated Support

14

15

16

17 Accessibility and Accommodations: Implementation and Lessons Learned
Melissa Gholson, Ed.D., Coordinator, Office of Assessment

18 Accessibility & Implementation
Discuss how the accessibility tools and accommodations (for English learners, students with disabilities, and general education students) were implemented.

19 West Virginia A Systems Perspective
Research, & Evaluation Peer Review Requirements Policies & Participation Guidelines WVEIS Data Processes IEP, 504, LEP and SAT Plans Administration, Security and Monitoring Process: WVS 326 WV has a strong commitment to improving outcomes and for providing access for all learners. A model of our system and the way we have built and review our system is shown here. We have committed to a continuous improvement model. All of the processes of our system is outlined in our state guidelines for participation which is reviewed by a stakeholder advisory and updated annually. Contains current data system codes and processes for the selection, provision and monitoring of supports/accommodations

20 System of Delivery Each feature has a code for our system.
Supports and accommodations for each student with a plan are displayed for review by test administrators. Data is refreshed daily, and a nightly upload is delivered to the vendor. This automatically enables the embedded features for each student.
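As a rough illustration of that nightly hand-off (not the actual WVEIS or vendor file layout), the sketch below maps hypothetical plan entries to the feature codes that appear later in this deck (P01, P13, P32, P34) and writes a vendor upload file; every field name and the function name are assumptions.

```python
import csv

# Hypothetical mapping from plan language to the feature codes shown on the
# accommodations summary slide (P01 = TTS, P13 = TTS incl. passages,
# P32 = stacked translation, P34 = ASL video).
FEATURE_CODES = {
    "text-to-speech": "P01",
    "text-to-speech with passages": "P13",
    "stacked spanish translation": "P32",
    "asl video": "P34",
}

def build_nightly_upload(plan_rows, out_path):
    """Translate each student's plan features into vendor codes and write the upload file.

    plan_rows: iterable of dicts like {"student_id": "...", "features": ["text-to-speech", ...]}
    """
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["student_id", "feature_code"])
        for row in plan_rows:
            for feature in row["features"]:
                code = FEATURE_CODES.get(feature.lower())
                if code:  # unknown features are left for manual review
                    writer.writerow([row["student_id"], code])

# Example nightly run with two students drawn from IEP/504/LEP plan data.
plans = [
    {"student_id": "WV100", "features": ["Text-to-Speech"]},
    {"student_id": "WV101", "features": ["ASL video", "Text-to-Speech with passages"]},
]
build_nightly_upload(plans, "vendor_upload.csv")
```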

21 General Summative Assessment
Implementation
General Summative Assessment          N          %
Number of Students Tested             190,492
Without Accommodations                169,713    89%
With Accommodations                   20,779     11%
WVS.326 Monitoring Process Form: how we document the provision of Supports & Accommodations

22 Accommodations Summary Data
Accommodation                    Code   N        Proportion of accommodated testers (of 20,779)
Stacked translation              P32    70       .003
ASL video                        P34    43       .002
TTS                              P01    16,786   .81
TTS including passages           P13    1,251    .06
Read aloud                       P02    13,131   .63
Read aloud including passages    P14    51       .002
Separate setting                 T09    2,471    .12
Translated test directions       P30    91       .004
Paper tester                     P19    123      .006
Braille paper                    P03    64       .003
Braille refreshable              P17    10       .0005
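The proportions in the table follow directly from the counts and the 20,779 accommodated testers; a quick recomputation of all of them might look like the following (labels and codes are taken from the slide).

```python
# Counts of accommodated testers by code, from the summary table above.
counts = {
    "P32 stacked translation": 70,
    "P34 ASL video": 43,
    "P01 TTS": 16_786,
    "P13 TTS incl. passages": 1_251,
    "P02 read aloud": 13_131,
    "P14 read aloud incl. passages": 51,
    "T09 separate setting": 2_471,
    "P30 translated test directions": 91,
    "P19 paper tester": 123,
    "P03 braille paper": 64,
    "P17 refreshable braille": 10,
}

ACCOMMODATED_TESTERS = 20_779

# Print each accommodation's share of all accommodated testers.
for label, n in counts.items():
    share = n / ACCOMMODATED_TESTERS
    print(f"{label:32s} {n:>7,d}  {share:7.4f}")
```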

23 Challenges
Assistive technology: JAWS, refreshable braille displays (RBD), technology issues
Lack of familiarity with the system
Frequent plan changes
Local set versus automatic upload
Making sure test administrators checked that all features were working properly prior to testing

24 Supporting Access Issues
Embedded supports/accommodations: P01 - Text-to-speech for math stimuli items and ELA items (not passages); P13 - TTS including passages; P17 - Braille online; P36 - Closed captioning; P34 - ASL video
Challenges: Decision making about read aloud vs. TTS; TTS with or without passages; braille paper (P03) vs. online; ASL video versus SEE; separate setting

25 Supporting ELLs
Embedded language supports: P30 - Translated Test Directions; P31 - Translation glossaries; P32 - Stacked translations
Challenges: Diversity of languages; confusion about which languages are supported for each code; using language proficiency results to inform selection of testing supports

26 Were there performance differences across students who used the features and those who did not?

27 2) Performance differences across students who used the feature and those who did not

28 PARCC Performance Levels
PARCC uses five performance levels that delineate the knowledge, skills, and practices students are able to demonstrate:
Level 1: Did Not Yet Meet Expectations
Level 2: Partially Met Expectations
Level 3: Approached Expectations
Level 4: Met Expectations
Level 5: Exceeded Expectations

29 ELA/L Grade 3

30

31

32

33

34

35 Summary of PARCC Text-to-Speech Form Use – CBT in Year 1
ELA
Grade   PBA Total   EOY Total
3       7,673       7,695
4       10,806      10,812
5       11,286      11,168
6       11,538      11,296
7       10,968      10,710
8       10,497      10,000
9       6,590       6,315
10      2,609       2,406
11      1,924       1,729

Mathematics
Grade/Course          PBA Total   EOY Total
3                     108,977     122,320
4                     119,987     124,630
5                     121,002     120,426
6                     137,021     130,407
7                     126,039     122,146
8                     110,629     103,680
Algebra I             91,307      85,511
Geometry              26,489      21,132
Algebra II            23,368      20,107
Integrated Math I     4,683       4,417
Integrated Math II    776         774
Integrated Math III   232         208

36 Summary of Accessibility/ Accommodated Form Use – CBT Year 1
Accommodation/Accessibility Feature   PBA Total   EOY Total
AT Screen Reader                      1,444       824
ASL                                   1,326       1,341
Spanish Translation                   13,530      12,340
Spanish Text-to-Speech                7,560       6,790
Closed Captioning                     1,990       2,026

37 Performance Differences
Spring 2015 M-STEP student results: ELA and Math
Performance Level Descriptors:
PL 1: Not proficient
PL 2: Partially proficient
PL 3: Proficient
PL 4: Advanced

38 2015 Performance Differences - ELA
Support/Accommodation                                        N Count    Avg PL
Used Supports/Accommodations (General Education Students)    23,724     2.1
No Supports/Accommodations (General Education Students)      615,442    2.5
Used Supports/Accommodations (SWD)                            37,515     1.4
No Supports/Accommodations (SWD)                              42,691     1.7
Used Supports/Accommodations (EL)                             10,158     1.6
No Supports/Accommodations (EL)                               32,079     1.9

39 2015 Performance Differences - ELA
Support/Accommodation                              N Count   Avg PL
Text to Speech, item level (ELA: Grades 3-5)       30,915    1.7
Text to Speech, passage level (ELA: Grades 6-8)    17,815    1.4
Video Sign Language                                49        1.6
Braille Form – paper/pencil                        39        2.3

40 2015 Performance Differences - Math
Support/Accommodation                                        N Count    Avg PL
Used Supports/Accommodations (General Education Students)    26,542     1.9
No Supports/Accommodations (General Education Students)      611,956    2.3
Used Supports/Accommodations (SWD)                            44,484     1.3
No Supports/Accommodations (SWD)                              35,831     1.6
Used Supports/Accommodations (EL)                             13,119
No Supports/Accommodations (EL)                               30,392     1.8

41 2015 Performance Differences - Math
Support/Accommodation                              N Count   Avg PL
Spanish Translation (Stacked)                      1,141     1.5
Text to Speech, item level (Math: Grades 3-8)      57,681
Video Sign Language                                52        1.6
Braille Form – paper/pencil                        39        1.7
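A sketch of how group averages like those in the preceding tables could be computed from student-level results; the file name and column names here are hypothetical, not the actual M-STEP data layout.

```python
import csv
from collections import defaultdict

def average_performance_level(results_path):
    """Average performance level by (population group, support use) from a results file.

    Assumes hypothetical columns: group ("GenEd", "SWD", "EL"),
    used_support ("Y"/"N"), and performance_level (1-4).
    """
    totals = defaultdict(lambda: [0, 0])  # key -> [sum of PLs, count of students]
    with open(results_path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["group"], row["used_support"])
            totals[key][0] += int(row["performance_level"])
            totals[key][1] += 1
    return {key: round(pl_sum / n, 1) for key, (pl_sum, n) in totals.items()}

# Example: average_performance_level("mstep_2015_ela.csv")
# might return {("SWD", "Y"): 1.4, ("SWD", "N"): 1.7, ...}
```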

42 Accommodations & Performance
Discuss performance differences across students who used the feature and those who did not.

43 Overall Proficiency

44 Proficiency & Vision Impairment/Blindness

45 Proficiency & Deafness
ASL

46 Proficiency & Hard of Hearing
Level 1

47 What are the lessons learned about implementation and related challenges?

48 3) Lessons learned about implementation and related challenges

49 Lesson Learned from Implementation of SR/PNP and Testing Platform….
PARCC states have the ability to capture accessibility features, administrative considerations, and accommodations data via the Student Registration/Personal Needs Profile (SR/PNP) file.
Operational reports document all accessibility features, administrative considerations, and accommodations at the school, district, and state levels. Some states are using this data to monitor the selection of certain accessibility features and accommodations.
Some test administrators were not sure which accessibility feature or accommodation to select, and changes had to be made during testing to provide the student with the correct form.

50 Lesson Learned from Implementation of SR/PNP and Testing Platform….
Some schools and districts completed the SR/PNP file during the additional order window, which caused materials to arrive late at schools.
Additional training and guidance continues to be refined based on input from the field each administration cycle.
Some states will transition to Unified English Braille in Year 3.
Additional guidance related to the SR/PNP and Before and After Testing was added to the PARCC Accessibility Features and Accommodations Manual.
For Year 3, letter designations from the PARCC Accessibility Features and Accommodations Manual and the SR/PNP File Layout document will be added to both documents.

51 Example of Test Administration Guidance

52 Student Registration /Personal Needs Profile Lesson Learned….
The Student Registration File and Personal Needs Profile were combined into one file (SR/PNP) in Year 2 of the operational assessments. Benefits:
All student information will reside in one database versus two separate ones
Test administrators can pull reports from the file to ensure every student has the necessary features/accommodations on testing day
The file serves as a record of the student's testing profile
Will reduce administrative burden

53 Training on the SR/PNP
SR/PNP Educator Training Module
Accessibility Features & Accommodations Training Module
SR/PNP Field Definitions Guide
State-Developed Trainings

54 Challenges
Over-identification for use
Lack of understanding of the support/accommodation
Lack of understanding of the student's abilities
Only one fully translated language for math: Spanish

55 Lessons Learned & Challenges
Discuss lessons learned about implementation and related challenges.

56 Lessons Learned
Emphasize the instruction-assessment connection by infusing it throughout the system (plans and data systems).

57 Looking Forward
Professional development for selection of tools, supports, and accommodations
Interactive sandbox
Opportunities to use tools beyond the practice test
Emphasis on how interims and diagnostics support the summative assessment
Ongoing monitoring prior to the summative window

58 Discussion Cara Laitusis

59 Last year's themes
Most Accessible Assessments
"Too Much"
Training is essential
Surprises
Researcher Envy
Data and monitoring is impressive

60 One year later
STILL impressed by how far we have come in terms of accessibility features that have survived through to Year 2
Performance gaps (all students vs. students with disabilities)
Data
I miss Audra!

61 Accessibility in K12
K12 students are getting better accommodations and accessibility than on any other assessment (licensure, admissions, certification, language testing)
How do we know the accommodations are working for different groups (e.g., TTS for auditory processing)?
So many accommodations are not universal tools
How will this impact future accommodation requests? "Past Testing Accommodations. Proof of past testing accommodations in similar test settings is generally sufficient to support a request for the same testing accommodations for a current standardized exam or other high-stakes test."

62

63 Performance Gaps!
Differences seem larger, but we need to know more about why:
Increased rigor of the state standards (i.e., harder tests)? Ceiling effect on prior tests?
Adapting to computer-based testing?
Learning curve with new assistive technologies (e.g., text-to-speech instead of human readers)?
All of the above?
Need to dig a little more into this as trend data comes in, e.g., looking into accommodation changes for ELs and comparing those groups

64 Data
Still need better data integration: the promise of better data across low-incidence disabilities through consortia assessments has not been fully realized
Still have 'research envy': suggest states work together through the ASES SCASS to summarize data across states; start small (one accommodation across different groups)
Progress on capturing features and accommodation needs is impressive, as is general use of features, but we still need better information on feature use at the item level
Impressed by the WV process of collecting data on accommodation use, but hope we can make that automated in the future

65 Data Capture
Need for an improved data structure to allow for data capture during testing
Lessons learned from NAEP
@caralaitusis

66 A few more thoughts
Lots of lessons learned from OECD tests on language translation: screen formatting issues for longer languages; right-to-left vs. left-to-right languages; TTS challenges
Professional Development, Professional Development, Professional Development: What works best (online, in person, practice tests)? How to transition knowledge with staff turnover?
Language consistency across assessments

67 Trinell Bowman Melissa Gholson Jen Paul

